[Binary archive — not recoverable as text. The bytes are a tar (ustar) archive owned by user `core` with the following members:]

var/home/core/zuul-output/                      (directory)
var/home/core/zuul-output/logs/                 (directory)
var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed kubelet log; compressed payload is binary and cannot be reconstructed here)
T:,9ħzvzvOfDYF\R'k& Q[$M`+ v:i$\439U\$ Ѕh 0Ne"4Z"K MΆӲYzB1= t Aܝv˺yuOW+,l)Z - }B{Qec[KWlӛvIDgb`htcQ`1B$͔8yl|G.of1E7>QA ZW?ߦ 2B =; -_rlZz|<1Ouy#!KAp VDJS$j A8hlj1Dn/T(<Jq`:ism1䌉(C(;,pw} 4:J77glö^߽L <{:#QE9({la$6PaDsI+ ?xo/I$&JFuIj-Å7Z#F(ObpBP(dY=%f}]mȟh3 ӱ)ÌH&M{,,:jb1"WO@??jj pIqe"18&uH9_ ew$R)'FꖑbDJ) oap^MlPu9"{W/>NJR ~H/*(BWofٍ7Ͳ5fqlv|+)-sN&qYbr^BB9%o%Xj:\$eeGzr޳DsQ$u?9;{}qwj Ӕ7S$߳R*~<4+=vlGw9ڽE= :%siLqAzW_Y>7h6>~݇Wi}9;:@Κo'58˖/Wa`O}r^9/ه6_^`Un1緉${vs s/;$?Lt5J:ݾggxWE/ Z<[_Z#yGg#69%uÅWѺven_KqT$ٌ6%!2eآ*JlY);ՂsejY=@шۢh ?XJ rt4,q$h&d-HGp֫ q3txdU}ծd1Dg(|jىd =!Vu!_BKم,rOιh> A { M JPk3Q#5h( X?`2CN p$u cDfCκwem$I~ZyD^ b0<,03EeɖE Im/o$/,KRU2Ǘqxg0Pp9u edGHLkgA#:a\v2;f8d ,WRf] G͈8iݝ>)]^(qD;q ÏTp#gåQbۍ%6&WDdT$.1-ݑ׈ykKeA"&[+- %4虞=^ׁp۫Wd֎L^t&qR9fq-jAў%_ ޜX~keqHNkG:,u7J(gS*RLJ[xEҋ$9sYF qs[Eo _lYdܥ0x~=O7%´owfL㊉F;J7~Ytg.>柏[Qއ5ǽ- TvUcnG2&|8rH/,oVW)f`):ȶ?QqDY|y~۷ShMx1#>Uv,069Iig~B; ܵį77CR i@`x_]Ŗt}4˔OׯwlkLFvbg٪d͕|8ߤ)ie[R1$.2љ sD6~rߖoü%Uץ$o݃Ohgv=e>m(4 Eb"Ԛ $D!& Y1 ("ĴuB3NQI( y/ń"ܘ҄Zګa6 Yk9~NNЋ[Tnw$s%q _($7ݜw>S_Fs<ن^RU+-X1{ϊ][䂴B)cy1E3IQs-sI^:JbE[zyIӛ_,QLG^L,PNe cLz/(zAj4x M&lϬ\sHck耵{z+$,+m0x8&cʁ@x(YĂIؠ֦RYE'XC^9zeɘ1mjR p \G,$472 0 PLjX^zI;]A(٩皖FN/A}n;|GŴ3>L Ճ-^)88Nx8VPxF0۩\g? 
\F"Upt^He)},E-s <9#pdEIbf|{{ki;jCTSi<+6۱*<Ah,hp_j,9NtAI¡6<ޤem0tY^DNjr2+w߇9ƐmgM S fX??Ȃ YsYQP*,3  JJadPJkɐ)!$KWKBCM 8JP M D=\$CT1/]КK`Q>Vy @h"@ @ܡSsYJĔ 8rbF$'IPYrrYp 008l ICJ=W;$߇I)= DL/,5͚PL뒍f)7zhS:$]' `,I> ,3/AY[Yo(aqћR@٩_t I%gR++`vǿ|O"Y,˴m lW9~KNsHҴ;Q/:I'_AY03$sJA%K^!kQBŠ6s*UCdeC`R&-Kbqt$B+HZH .pWm9jDݾA_7 F3pusJB9O$ԮC=&ZN% :k]^%V u*>5D'>Z\by>z;5Q!["l@o}_֪NY{𯇇՘Z2+(A@іVkO ;4A]k=eUxVj':iOF"~{2bJ{HURɎJ¸U`\u*miw ac,`OZ]\HZK MVVHXer`N1g#U9U"D+<% f)G+y弩4;hX?MmYzfO]/e[6*;㲺 >uBˠ!֑F74zet3}z"hǤ1vjN& >6egf^wgwD5E wG3*?%?_+wGj VHT%>&2'A̅A$2q"fX{_W] m>v= |vX is'{k)O_.l% ̒.妗=<^@>s)opX(|,ӵ4-E#v[WTY/8M6U/ %sd smgKh\ݞYz8P8zx˫Zf]zI Ώ׻/Lnb>z-wIZnkbG/,d5go,3Qtt[3".;fc{GʱaG~9< Dl͜ qt);uҒ.Nz/Z.4 nx^{ʻAQx[qR7Ce3D=dFB2F.cVy+fQB2 /Xci7O&M9'ፕ{)~g۫/OTQua몋S(ACPs&_ÑD,) -FX=R5iBl,ZV̠s.;]\K-NZ3 ~Uv?jI f&.2wFF&P%S=`Y!F3GHua&3m)DJ&)zǘˆSRg:0rv̢C/(Uْ_}8BCC6[ K%LJ|!_%#֫8&rD(.&o[NIE*),54"-Us"[Mr_rvk vY$ tHǔR1pڃTUΐe$B;ƉP&2qLDV6%cI2tU[NG9Տw fs/F-1$'."&&I ^GN d %Yb6xDʞe@A"PS)z';șZtYICF.>gt*3CjP+.EO:/~C<ŀ*63Dpi}`E@T}̾7 zQEq\]5~Ub nLkTd)QljFX"z(֔Og_tPنǺIp\&ij#d4bEϲ Sd)m=hYFU֣qCIT@.}&csBn*h$T9ڑ#Ֆ[7w ǥ/jnXܻZ[rnkɹyO1پwgY*L\mJETh8dwNlQ}mj;.krƛ{-s] d㧨AAhĴR9 M'4V09ajtPi4s״PNM\R@zx¬KI(g{ډP;r:n|~IY짭0xZVfg2o2dBp xMJ:dgE2iLQq#HB_t #\khfi,h0: C_Agzɑ_I }(`ݳam4xj˒JR d](Krхd2IfD`D69 3bA3K`U_W9%cx_B7֣M/G[x~Y |]4t^R_fIG[SKhh HrbjcT Odv jp]Mf$ DGnni^HÓMF+D/- ^#1>wU˯-HkB57+X>/yQa:w޸D5%rd5rˉF@`-#@/ՠ9T3p9l$cZU\pNuoݫToRr:%.3%y MH ]Y's*MS/9}~I'Cq8cX>gFg0NT. s ^rÜ5Q [[cQ x!tq#QFۛ2=G$|"0F)PF@Ajd"1GRC%2K(zNkq"'_t1ܷhc&XqLit0 lauQ=~T)EǸolS:`k搮- XLjA4Oڰ~} wYgO%N>Kh) e)*\(w>$ 1qV{tHbaJ]Y}g\eb;.nC5_)S7Rކ*/W](EU-ޝ{^޻֘s>uS{y Wk~>pZlpa0]:N_c5LIe\7]vʳv$G -˕֟> /DΒ[q Y0Z`D$4İGN @XAdpKRs`q" őȹ8ޅ% 'g2D'HkKII>DpCLDK P+xGI4 ޕm8] 7y5HH> jw 5fp~̦_f5"w] `5+('" hG,w"k尷N b6¼YwHlcMv-*De^@ WۢGaiq8S,qǍdM_&9r%P+ՔR[3Eu&zGKQnތzw ;}Ő,C{(}v'gUnm Pf[֑Vr&G@Z D2A7G1P\9Hg[GH(򵖜N)bxЄ^@],:z XWwFZHu$HB?uGoF(wDH%k]3MU&'[R QBhUeD9iLy|@Ny>'ΫT! 
c>)PpyKLG eZ30{-#cp3pv P7>4qM^'U7t8xvG0r0U¯o&NݚG3A6.{ӶfwxpDku>yLAC#k1Q* Y0FFmlFBwvxhCR `]H5wo> HB9UaaR"2 g;a%me)d?F'/~𔬣ϷURy+жDp E lI%t."K@$Q(n֢G(#DO4QM*Ffd j0DR%clQQda6x,$e!)pj)Y0ׄ N.hg=ܴ(ş4Mo@bghd0¤OP$ L%^FĐґ0хsdKƖVxK(E.qVq@L rRd#T'ciQzHLdu4(&?39(V8p!99L,+FJYid\A>zټgjts&BR[EʹbΒk;-0s!=Bɾ2t9'DGC*eqc*h%{S>B\ ^ИONh82O  ,l]S$ZYď3>*6%"<T!JAAbhsj1ϏټU;&j f82wMdTI]HZ|O0,Sj4w ҟun& Ćָz0}hɇ&`a%3gFیA9KE>&1",DDꥦ&ZH0<Hg^z_gmOc7'eیwK4SI+QG7ݟQ)2LK)|s|s1^dAa-s;dqߤwK-X?j`שQ5߅3Wpfa˜c%ͺp<2cxex4 xBˋuʱt۠}g2nC%6z\&+Rܒ H%-\nq}'^?ן`EGչaaF@thqod:stFaVӆ4dQfvJ@ 3hwci+e4!iִ=vÎINg]XE#gĶo)'U:^*ifM.6Y1@f r0$>Iu Ҙ6'RpX)8~Ȃi Κ}5Rts?hU:2‭iU"bK(F[ .9A ԛjh82O~0Cd,B^> '2l6~?Q)ٌM$ܽ Q =DRqLA /r᷏6D 3 2VHCkK F$% Qw?Gv;,5ɀryC#E  " őȹ8ZEgI 2qN({J/0?K8Kq+?6m6"wibBbP܉HU4iaP##my0g:&y3Y땃] (v! |[#7Ji dM_&.rC`RGTSJm͌6>IȚ.E9+ i~fC?$Gݚu3ٕ\K(}vg+i>aّ.#WT(^9t`WO㺩M o q`аٓMPC }[*UoE>z&ؔcD2Y+냉?{Wr HTo "yK<T:I ~(J\D5EJdN{k1NЙ J61>3W:뮻1G7i.DU>9L?9͟isdYM7]M]6oBe;wgv1sͲ8t_]:xew u<* $v*ݵgڰ&^mkȸA ;;')$a͉G7ϬDV2"5wD'Pq˃~:c{GǢ+7:N0|:s>KZ&j4G1d4&ǼKTdF&lE==-x6ԅJ+ʺ,:סk5]ww&݉񘷒㤂onb9^r2{BO\8& ePfIND%VI,3Y8M9ZԅMg/κPB^wBxB:;ɨ9/6ryGi~B𽈌6sbV6:nx윃_TTk_цIn`F̴" xR;ze׎P\@B) r*=Z뉗3*@X"czgIMO|cj 3sY;٨ #S: I FR*U̳쉩Ўq"dL0A8aǤQdel:d|Vm9{YW-jNx^ڇ\R)1I^Z+fcN`,+xU5Y{CN=Z $a8Y0in@ǜ5f $A sx2cԪN/:;CyGgW~6V $,"pRe *BAap\Ͱ|/>jZKT DA3B]z .XL`*Dp9Z Gv7Ʀ!Ћ U#cWjd1A{IH%ED^e!<0Ȏƽb):`yR""{,Xo(P}rx"ҴC =$ey\1$%d(D)z3-?h M6.6J]OsL uȍ6L5 |4q'n>'`u6Fߞ.NHȄw5OB)W{uP1҉G=5NY>,.I,_nu%ampFHPM=SF3 1rLK '@o`RsB90N*8q&B="4@0@"o R* r q<'N=.2[7fx}<ߑ}^^!uϔ'+C] "h4h#g> 4iq2J2nTsI+N47)5\.HeSF0t;B<*Yʹ!@OZSXs%("k`ގdz}_zt*8"rdgZjQTZbYrA]Om;j-մW%2rV*f!w PXV V2՞IgV1R]F:$6I SPMRL$P%!Q%!NJdf+/eLϩUkRY7Iq{frs=(ή{Q!4-iFaY3sKki=5kV VۻwheуF0E/YjƖ湈s Xnmf$h&r{lF.3{}FK6rpwO30#YZP8 o3wyw >ޖAEMF>⟮AO{?,mRp x \~krOFnlq^4zt Ar^%VO= :ܒW܏4㒦k2^?ɔ f~mz ّ G0xd*ҭo o/sR L[e4ȷ_PȷѾV*'5'σYʐQ뢁["ə fl{.t=҃8cVOz|#"rh4f4~З۠ix7D~QeI_]?zHꐮ>MΪ.2[Cxq?nћ2LA1 b%v ,Bq c ʔ:fݜaمD8w! 
8Y-*ybL6R( c4D`Q\|V;wYt xv+NX!u/O/ᅤ 2m p)|9~CR:ە9qdII 3yXTd,l"rZh/ygUa7@;3K< XKHQe Y&pv"QI$w=4(Y$j32huR8}<_;h"nk+ B$mx.K&=xy i8c*;Go臘|7jC!!;[0x(A'mgߜȾQZ s4&=8YZ73<Ɍk6''jDˠ)eh;8rS)v錛wzɁ2V"|V{iuQT]ފ}+ii9k%?Қ"4_ <sh@рdqFA^U誐~v(yt!`Ջ~5k TTfsb Wgq L 4z?z*??8nZ`+ '5sv& u짰drVΜx?ҀO4~['•<|Fhw}|9ѸOg]+BkHt: }ђBpqTvn] =Zn"'Hܘ,сMSF*thUVLZ$F3u{VDU`CL )@.2KəeNd=h{ܼrywP8[k|lܔ~mxu{4~8Y 1!-8.64&5e;W92W;k{E}uia6عO]f ZF/LvLM4 idC#9?h'=f^Wgmy~`KOcDj$,zKڒ4_ 7nCм qϿK.mИKZ:i>+6if2~ fN'xTd7KaLS7%hkYdMv Oe-[HH?|RYa&zo7B6<ˈh]"1ND[=0D51EvKn][B]ko[9+F>&$hlbgm'6d;ә9B8dթsxYEu7ΐmS>>~9C_N. -ɣZ \O?Oo1KVN$h.䋇5W ]=7 w1Y'XJ8bF>LkT?z_i\?[jRZ86hdfvdbfϊR˙ ՛\W?rώNu"7ؾf]l*]jɥJN `RI֏=bN^4CiT`Tg NW{X{wqWͯzIjcckc*L:44Ǧ@J%eV2OΐME9G}#mii:ž!(|^vW~认7+f%bcǗO׏e;SђC־t2$%41Km0s5.R5#W/H8Zw/eڲ?&;~Ojf ̪U/W UO*uj*߯g A[PWG< mSd+j0ϚZ?1]ܲ›K^k<Gx4~v:sߔvO2l?Iz#[0ggpԯ|)3y;JE ~x׊d.ςuE,J1y^kRvăz8]IJ1-߿02ż8Ye ^쿄Z枽gTik_+ U67g6qm}*56*.UƦM&Sվ( cSaX5ʍzH6st>4/O6ݭlO~9g];ő~*zky{w1``>nqql&=d} 7v Iϧa=wM+ZY9i".'["s|:Wo_8Arah_4^#NOیzPJMv7ol;zL/#ƺ!v n,qM[o^|=KW][#z Sxg6a>>Ճ?ndyyblҺjS13@j|"5~ľt&==/lXt%Yw#F=z,ziۻegoE8<_Z86;nZuMj1!SBES)pPKbKIR;VzlЪg6~rg';goFcicNXP JR)UZuo ӁOї;iZzN寧<Քtsa+)N=nU?J.+gb*ڄ0 AnsQM߼[z7^wΒZ RS{]1j/ywy&9Dpy_~Z4:X};?O=6_ͻU^z/8~8Xb/Wz 8_87|9>_^z<m*#_;UM]ȏc }ʡ0po(sEWAg?ofA9}r~IW꒯ՓeXA Wm"c dGT̺ .:#1,3q@xR煇淙)L2\]ݯj4ϑȈolJ]L6RՉVQ۬{KOeO_ݰv-JboWZ+#SnhM:+t%p-9NLqz+$:*!b]$xt=}I.k_IzÝc&R(F\L\BZ47s`k1n[żZ!g;|j-Su|ceΗZ%Z9.IZ(I}Tͱ*B@Lµl>քbilX olmPʦ8sVՇSʺl &lh g᣾Zeuqph (՛E9ܻlHsE7P&RTa">gObUB]ˡ➭d7{ ^ vO0a˨+1.g'G1뷬1kp-[!P2(֧!J8jm?nΚ{TeTM)1ڜΩT.m(xլȵS4Jg &luľ@Dxэ֠5|'XAr1o#D>ELH*YlW.HcM6kc ~=0H-HpTH@ 𬱈'm5 ͋r&ϪuQM٪\ -6r%d13,HXfU.#n) ;*x{R]J y;a*kCLxeP@szaH9Xv E)јV XT`t[ \ q96,y?q$7 <(GNX|Ul%dp/!Oscڈ(؜E Nu(h}JXqms6p zM )pWO ĊU4+ΰM@68ͽ'\Ek=dF y8T_Q4WN`VByh,vVLfA =eTC $>`. kNC`\!SL?ghX@C=X0,`ђ44ֶq̒:!{IЭ)51 C ÄQ>TKsvdr=AI1΄="T7VQ rlkL/ vR A3d*F@#)# UWpտՙʠ_N#Ԡ,E8$FdY26H۝B6rP7_&8M'2Й: [ /dڋQLKUT+fuQCu6r'! 
$}2Lϰ |t޷!,6{~ w/'-QwB-O HМ6" dZ@X8pic `cU)prt9|$Z.]U}-ENpƊBB' yPK$!2 #(|p>3]7h, # 0|DPV#76d n*^X:7ɂNU~d}*y*NTep 5Yrf貚N;>o hp,[[~wy2Wo1K&!+xmi,{ إv1Mb΁6DR. -r! 4P P~3 lcYBm.F(ZNR;0] | ,ٕ;g,w 3ZMVXy7N A/ʀGd⪯7uLUV}4"(=D=;䍅"b+@04 1tρ{9Or"BlͥVxL;}tb8k4Q9F& Pf9v⭛cs)ܴGoEU^ PnuƂZF*w+(aQ4lH%/VmI,`HJ-ShӼ@{n~{ ڹӞk2B]7P7ib(Z 4#p(6R:Z})rY:Z*5fӟFi9BGբGhm2e)"Fl~6$P1⬎c! Jd )#S CAʘ@69$diPw+P}c^f*O+Ij.UJ \ (\B^.zӪ@=ԟPeQ08U.FB,C泞:PՑ-0 `ҩJ&<U=%h pNrR 19KIt`eSFvLf lRLA$`yĢIWr^5.CK*?u-+"`U0P5[F>˽J]h 6d]E* $N!ۑryoc)/TYzZ@x.gkiULf Z#?^$eX*`ZݩUj*@}%諩DvI,V䣒T"jiЇ`"H ryHH鱏eT!V;C:jbn:p)ߒjHVܿػ6$W<+34AƶahiM\>7xx,I6lˬdV^n >:Q u^W[.;9Cp =ljQD,4wC/\I-*ɵ!3p.btP #ƪ+&)r6wg]d^ug5gO=k7f -OKkB a"d%@"kfcN0VimԤ]ߊ=g-&CJI:UR\\VB!}`taLL8 ~;mbV^%;n 3JL G+0¢,f}HwQN"dL&"dUuIȺ(x6]KL ,^*|Phu&0* p%ZPSz+dnNE:ِW`C{9$E0uiiW1c≮,EHY:;T1V˩hGgrvtk ^7F?8+t"(REW-i/}r{=7t#xkH "!ğ(7.`,,,,,,,,,,,,,,,,,,,,,,,ξ^YLq/v2pC_,4יC;kqI9yYhpD;hg"@"$! DH B@"$! DH B@"$! DH B@"$! DH B@"$! DH B@"$! DH׋rAU̓ Uo fm6~R@ %*YYD)~Piڠe.X:쾏V \ȋnp}nDZH]vvљ:qk{E}URoμ1x+ ΞV4Z7 b=A>jZi)uw[R5J۾Όhh?,/`o ~~}d4M[0y]![5RGVunZ_x =f2Ѱ"2/ҜW*&^9aeU[ǭ68fmv vjfA#HfAmf[0P"G+k't^Z옯u65,JUp]thvw֜fwkN s9=|<1onv{[7B껊YV׏6Ra֚'U1NX 28y0 +#T>̵;.jpg٭ko@'ѴloXz^B5o)ZOz_AxԆKҳ ڳ^)>/g:a/90+ h)yoF|>?jvUY*QiW}Hulq;$eg/sXAGΒ{4I̕FU5 57ԬРuc._6_ʼ51oXfj~v\Xկ?]Bx*UAW cҘleŘYbIvhCY͢Rh:;9n-M#Od90JJ*KUo̗RFoCk mgIdžu.V`|Ur9-Pcb;@,;@vj̦?]5a.+ޕV0G}fo"Nx(i<</}v^$%,2b-sEXV].ɑy2 uͅW +M_` (DN[d4 hv"rx=VіhVEbO ]J׸86}wiDrvJ_Pxf:)Mx[bEy<ʏAdA@̌U[Á2J <|_st:C a}w{/KXu`{Cv6Æ@ mXc7ʭ7mVmjb|"N g`q9`ggL'wMժSt 6"൨`5җlH*"ȊkGNM^ߛ9L&ny@lOG]c {11\umƼiL\;P^J}A>`r}5/[Nbrlv } WWy`=d>[`-J ldWՕVVʳPZ|,(G5nSJ\{rP={tmau sU{-Rr%J. 
%f8К(G^2ԦR7إt{hCr'k 7hw7K[ tuHwgF?&.VF;șn NLJydz .AKZ0^k5CnnŃ4"˼2`3vާ NΦZFW[)TeR [ v+59`k:<le-)9W$lzgTsHGQk*`kRVNc-:(h+JSdvʦ:R^sYb߬1Y$dT,Gi\",rs ːq2L$KNzHkDqGg2["tuh7*~|/s;x,fm}QrK__ :Xacwb:o;W=O:q!?ۻ[֌bGm{XGۜA3O,nǁnYp.V%+(Yͥuj zo |QJ|[<~Dv}d7;*y\+ނ]TW ee23 ~7Dgtաe_(LD\-qX?F}?J8hr$5`@'x"K2WPJMotMsE<JLe`3Bv5eFV0` cTS1C =VKT4~2?F bg%|~nMSFMƋ7 7fC&Gq'`p1dž ?sL <cJ/ႭfaxO,LuVt/|L/"jŶ6òXwYB`u"9}911XJg8re)2fAh]"1Bi9+0Ys)e/"Y#*^1Ǝϝ5gq[nhd:dno?dzɜRdc/ 骼M:,]Ng q>*댖|, "7Bw{0\IW[dڤ>'[&s/Hu2v֜+v+Y$еPt Z u-u9&yF|i?+P`iomQ1xJ]&`BdoR`1gJ{g_X}T_$X$A]}/%<jIlYnIfƖf:$ aez I0MM@"1tڝ3jQZq.qǾZ[Vڲ=獏*h@nhH $dQ 6jtXڃ3_$H.kĨuH^eYΘ6@*emLD.RP,O6 0c!c[ozI%CJڻEżws 7 rVķ$9Z"Ejh[r{-Ԃz mR:ef ?oMI ok4J*!-D 8 Rs'lʃeSʎgSNAc!JH g(sN+sšvTp 9^7e/\{8򐸊.rNgTɻ" P5r^vzK{n,wF\_+_?_TJMbE'cUþ-w̬u*3ƘRΟzPbV I7iz)8~BRbiꅎ2fGj>YA(XEy*Y2KAϴ^ >+e3("5n,ǜ+wIXZy3 S"{>֞rSNMn߼[JG_Ef㯣#-FS-GMOuD;ad-H%t&i"ļNɟpZ1Μ0V9zKn[cWyVصfO޿9CZڱL5 q{r"Vpnd`\f24nO)J͇ܺZH˳䪊::j s6PTc8gsVmCZH0#N*9g'Nz%'Nz+ GMz+dHRHR}ޯޒc(;㯣JkRN*9#ƀx'AZcyH{up]"NpHr_~q Ǘ^u|lH5|ъ٣I$PIOs ) &埅kR}"a#*fmHLI:s [&#gux_k>$_M=vws}Ni탠 kCKb֗^bfErM'`v& Ȧ9GguB8,^0ye|Y{zV舗l_8)\C.[K f/ Ϻ-ru~bB+0#^^7L[XzWUSQ2auxż݉dmk"/A_̇hZKZn#knq9v\is-R@Otrn< qՁI-]ytw_"͜x`-<Ќnv%Wc2cbǎKwBc]Nz/ZGe?ox~*&ᄷRBleYW9ށ")q hT@F$M+뜫9y>ufH-x¦NrJ24A~xE:j|vߏӼe wV>wbL}ِ~EYȳTLslܻ9Qu jzDR'9p~ T)53n}s Q5`k98}c֒'nq2X@srikx^lFV .ݏ?I!;HgB-B% !Z@h &jk ǘWtڢ#=f&(൑oV#e9LZ1od_3Qť-^džlCg7V^9 Y͈oD;=h^,<ܒ˧*Fq.(+){$T@pS45҈1X|G%9xKn5}g;^mePdH'tY[@gA;iĨT@Xey= 1N܂ B&2ķhcR(ljP2de9FΎr ftgSAvf\ LJ :qR$(d"J2FՋ@MQw'8 G!%Τe*L1g!IC&.!1ީXA8vE'u;gi} A qqVzpihE@R"!P7 zQEqt*Ob_&2Lk*1h:hFX/% XS X%Ʋ.Gk#5Ջmؔ)zѓ_5ٳduF [d t! 
d aޫ,$S҂1޸W,E "OJdRW%kGW+ AЇI1/D4!~ _A.ݚA.HHsN猱`/V4&.KS:yk UH_EWo~<υޔ;_YLУHۓe 0{ƂI()ʃ<19;?PHdLz#-mJy9mۢ33^|_=q_x84HHGz::JyB2gersѰ\d 5 I #@I; 1xæNԝKDY1ڐ*B.T 0f!c@(#!$CR*JH(VeH* yZ_aa2T!ifMCxX3pҔꗣ!/\P_[cCALE=Gũ`8̼B]/N-TS?aq0Vp~iIHn"XAf/Hui&I Ƶfbe6YKKPJ1d@ \&x@D|Pj&\laaZf{m*a_hhRO s3hwaFhYzq7BvFlxK;'N0^O?Jwkշ`^Ԍ-?s˹9s X8Xq]Qn+eI-Qa\њHvp=L0 yeD.q:kヘI{%ϷI7堢&w>^Ѓ~Yy-WJ҅$S_8).x7ҮQB#%Hͧ/~aH_6lLʊp-{ec\Rwh)g~20q6=ák-$Hw1 +Cc|\rInA:>gߋѴe ,__> hh-~[eًgx:^9wʭqq %]/bn6L@β/[\}Cy˖={`FsEp԰k4ѷ%[{p˥d:nϗכwz_V8x_Lv\b׿ٚ7'-X$Y.۝燭ߓ)1(|P"h,!@9FΠLGoGw覃M݅C^qZ gZؕW~R$,JHpmsO?%ۀ!#pqru*:kR3UznmG%2F,90EDG]O‡ĜO⣛naOSQP<$/XqGVmxS=9i=lp'cUl#T_sx:DH1hcՃkKbLjKiC5>) saq矡jȌZ%ZMFg>'G ZRV'yQ:+hrɠ;n}y9L΃&%dѡou*(NcNѺ2`td)F'_:d4rf$ 5V<'_*Z&Y9+ْ'PYe=Ťrj1؎_}HT%KV:Мjqפ`k{F\?hO-,Wm4^U ucUp_KNspIF$c~ل!k']`\vO;3&Wy#sLYU`D? o]9od>n{gOI؀(L'9N8jOʼӥ<4ne3>G!}j09!pCS\su734!#ݽ'tSPtd6DZ'1SP(ͺXS]<6bE%8v/TG^FWGTGQWKPn|5% eWrbjhBds:­S%. (Ĝ8ƠrjS9s.F7r,_1U:fsn(<%/ ۯnph5) c|56N<džw&}C!ѢЈYoسy+3cŞqm\G7[IҨE5po۾u+2o%XN5AV/o/rcӻ5;DOsdx~W;-^^Fw7#[wlYU/7{cۿ]o yh7ix5Sŷ/~OQ'C1{nS(8Ė=f}sK/b~.UփJ*+tߩƻuGijEA0M x~T_-p|䭲~3ObE68j|_.ot`Kcu)|Ǥr?s yC,l0ZQWmT?ΠL`M|r("{Wq  WSJtHgݍ;^]5r,ZwMgm&=!حS(h-#sJ.V:kB(! 
(ŋk@U`yNF eB.*9(@ p,m&Y%qg[.ۮ"M1E#vC5"vֈ8iI#v%NZ)QbP&{T@1WR'Ą=N:T?I$iy'qTw܌1cD#%'TvdjRN&QkRJhI}5g5j{r_~_kyw(j Gk>*῔_avu\i=*AѪGP'Bp>uq!vaJ6Co6n8(@@ف+cvS;?Nmt?nC_NէON.dk9 ^log7K-??z!/C=BlԢG[T)i&`\b€ )rIsU[sqVex_<[1]'w_'J O_/8FYeLr3Jy&2{Vl$ Ds}?=\}j=Bd )jx)\\.$OH$`O0y֓n# $,I&~准iKkzfJwG&7l> m?mdz-ªۤƤ'MwX+^ 2-~nˀwyW(%`…_X?ν _Nozvc%bO&i!}jrλdMOxW;MV@y{ͼy/TZ3֍dB_q Y@y3(j>8v twg0%öFN6@69klDgE8 ߺBr]_LRo|M/^q¦QF=ZSEq*!ڋ&yX<;^EcjR`+6տdҹRN:W0-s_}AlB~tOm(86U̙OMUHezx erU30S#uIqZbL P J\g1";o4ԻIM7rh@+:tD9ťžW|C0z)Yyh7||Ė 4~-(OoԌF{ȟr/JllZ l[j|Y[v 4:::?Bz'hZw* >Qq8)~ɅaV/KMILa *9@u2UXX%qPVIUINUsfM)T|q B)D>T %UXK(\x.r;wL[R[l :TNHp6J:$Jbޕ̊h76ou#;}bѯאO}w*~,lG|Dw h1q,&fՆ?nͯ:OHJnBdGAh= ~Pj̓ ??TF@(LU9!Nd\[wPKPq.c*wɐrdrx"̈s-[[.<=~<ϫ:dn}tٷ6 Ub#X@IW#SV^7XGj| }RжF\ ZN` S 61qXEP-INtFn ~g9#If.1Kس?,úa)ѢHd[^![) լdVVVDd|qq6v##;qE,O,sIqå|\I.ךWչZ*%i|תP6Sj&3NX`XN(ϫ`1B$͕h*[ znb/ Ԭ/ǻ`@I"H(r@"ViFʀ5}A8hٴ6\:Rr]i!֎seL()tZ[i9h V,h?{X XFޜ" >*jxXTH%PPDsh?|5mfg$o3%#SxQb|,(U"#DErwY,P~ &Ddz}ȟh3 ,*#J.Mx"<:CjbqGW%=|Cu4U[rI2QC!I\EjNx"ZZ2; fHe阢3B$1x\x:hjsB(&P#xLP{BքLm*lk}(;J7QjԹe;Ӵ?1я.Nߋ[Ԭ ɭEQUGJZ]| SR[ň%!'\C1\3?ZMHAvW䇤|lvm6߫3j`ݧN7L[L7*[n\cÈһ4u8,F_qǭ_'>AS]KFj?[F; c׭бZ{}M?_gb/lX# YU N*d(WT#PS ՍSh]j4c*vM56[#Z@ّqzBc⭘GȎmJJe9{(8?m@65g:lgGٿIz+L{ٯ^x4ʮƒF$'BV@}3^r֫=1,PͨnQ~:H+;`|^(xyOČ|;O?nQYs[SacUs|a2yl:ڛPD@  /:4#"J19'"^ʹ3,=g^%fEN )4Rͥ ,cCX KXXϪg1x#=9_|akCv80C©y`Eme_M#SfWUU2Ѭ'leC=rCkA\|j+-><b8L؀`S)%!`OCk&%z5F r\Ϗ=S?o$icvէ._ ۭOnFp0?~z /9}_7c?{K4ʐr7u̕KAxAV r}w&n~-ZLVn5H^ J)Ex=>X.$1 G9[5e✞Zt E>ᾀV JVs^YdB|:2zeY㐋(p>%d0kl u;Th֜w\pe,G.D80$,PT\ll/Qm8.Q3pj-;fه]5nzw]!624=חϫlXf^-]/R?'Tv-vo6d*Dz/ꅎwh{9 !Aj<tK6A C. 
8 Q*5"V4 ;_G{VVDG,|Np.uz6*ՉT*M'7`y߇-U;1ZsjRs^~y_6_Bo˸8ln IU Mt2xר$6خiش]<Ӷ$4KA7*h@CJȓɕ`>܂-u%Iht(Ebh18o3 RHF%S69kED*cptZ?obo<|토^rCø?._c~Ia܂zΌ|쎩SzNY鵽uiuC5m:zGW ߧ I7 GϟGWۧvL:BmzBYjJOm{iɧ`eFn9q|]dOϼͣ_Γ=ꎉZcp᩷Is)Tc^3tWMh|9u֘OcJ q/wz.dߧX)dL;=?r5|4p(jn?>wjZm lw{r:O62lϤ܏ QtSa;vMJ[C hJƺ(bUF$Z- "5F ʰJdd|̽lwAo!Z橏ok T:RD†b͢fX*Fs]N|<98^ToQG 4b&}&kErIJ%C@$ǂ^`Ij&ʺHs$'OIC┡P!ATfTrKPHŖG˭ZIda18V²5Eei)!4|8hOTvF7.ufL#V\ZIW.9< @}#y/h98Al6dg#9Y\*/%EPmc"(CJ\){a\.FJm^XjFj7PZ43( ݸg"-F"#h3(DCd,eӂ"d qe4zʚ,2%*1j$63c AuTŖ[֠>vB̂\$b18V"BaDl$qldTT"M G$AKYθ༆( I3P* >J),-gD\f:'[g1*9V.rQ4rg-|(|CEy5}' hQP1@#c*38B[IaMQ娀S7R4nwDRPalPvn/5Y)PMP@Dep4uA6Mm3lGq0 .}ݪۿE[Cg}cf_} 5DEep|83}+W6t "(=0"ALjM<(ÁDVHu mm"X˕uic3vvO;lx]x=%jssLv襤X&yX#,Ȳo9KyPCRIlodA;L[~oӿӭf--ND×pla1 #IB 0;S4E$Շ7xxDI@_b18 |ǔ13ŭc{|W1m ګɪtעRK۽y,mKȸӳvR܊ t5]}pOmB}':~[fTxy%+a>]w:'Qӌك^Ҭ弒݆jȹE`LhǼsxJ&Y=ז͢CBh㥷, ^VuK!3'e_Y2j3d8:]yj8'iM5Jkֆi^Kmf ei|F1?{jzu 3JDszͥa"j+d)bA.Jz)8S,cG@#czƳ6M'tg+j{3.zG~t5R~Mk -c6 ~stZ/b6(8!Yb22s)aK): LS^|rVu[slx٩̢4E0c4%%pTh4\^)BǢ|fIԜIC(rcRRp=KBIǨ.rb /gŖӒ<\q d'6,`+JNy 2r@Ir@RAwΰIBPV{ѱ@IZ&[g5 pbCpM1H@dL&1xu(eHtP1?oN[AwP2[0\hfAD#PBryEQELQw#6jZ(aD&$L2U9(G,U$,͐#Pq[9f6=>+F"{,F*BBQ%-{ x9Zρ`voA$uc^=(F!rG+BߢfY1#uȢ^G-^nkD$tqL2f֧snZh^%e|-$թ2RTu @DN""g]czu.#{3ѣQEr$(ޣe~lxd-uiA0KYd>rZ?^HBH(o[{|*=1 513S9vБw 06m>R=J}`<&)7}AIt*&x/a{6Qر9scIBQ)Ũ2J@&HуClDF"$劢C.e 빱1(I@%.AψO9TRF(; q_ kt z 8b>/ڷvѯ~xu54 ׅA9P}!wx0h0O=k.~*}<յ %QFàM;Y5[^z|'u6%>Ƹ8gHS')9|Zڀ!Aէr,퇿~}uX\E{ H9 tqM&}^Gͩ7Gxˇ<Ia΅⸺/&preg¡9>]u?puzڵF%=w1<EcY (ךY + s̥\?ֳ׉"dtOK͜:%oǧ&^2p\_;N}eOEujN8g)^x#a?t:A"|KWfW5쏹+M<]m@qS+m]1O$8oN2ԜsTjHԖ!չM3UocE:ҍhOimt#w܌Qr@TMST⻘sUOEsk|*+,:wn8ѹˤ:Vmp4|/_H q{HG$2a!qt}nEҧL9(8=B_Pp! Q rXGEj߶ u]P2tfWEBgr(܅kCFU)Z%u#mU጑T y1%r~)%j{Joĵ Lsq$kGiG5@}P‚GTŴ戯uMؽcNztqH5\nvTb41KDJ8b)Hk R$O&ITX8g &_B-m_%wq}ĉ =Svz%{?ON$9#JhUlUQxas^cfV-tXZ8QȄQI)tRKƁfYË rgD{a{",Zy44WAFT#RIl2 *vB|䘇4\I$.gJFp:D))X GEB!J.2(ʏ A!^,t)Ѱ?.f}mȟh3{0*#J.Mx"<:Cjbq*j}_RM/@>)P&RCb2#"DMR'%(c9,#W pF$sK#y4,B(&P#xL֝(lcBքÂh實9tBWٝ.~[c? ;nGpoXaG_'i`={tqA\}Sb N"TGXmfP}_5#/KB?㣿}p[c'p`6IXc͈XQ ab=5M\LqQzWߤY1? 
zR2p"6r S[i+~2v5βgU~5gg,"D?Y#Nj*'pJ$ÙKTpB5{qkrr$;+z2:}S>e-H8j=!H D5m-b/A3R5ˉ8Sym@6 g:<}mU<[&ٳT<鏛O|-m,pY4P$9u>&)OۛevtW-p}$_o73KzLM%=iWTWYPo#]-1?U?ꚏyܛjݲe-8z`)"fs8CshPMEqNEBޞ #=G%fEyP,:c=NH6wJ[Gc`>P|`4~WsoPpv+' /(}1َy\lǼڥl#v̫Ԥv|َQan_ճO7PdUY[ym rƶ&o72l蚃ap9N)j8E1͙{e[N.eJD][o;+yNR$i`0<,3/ N#9eq[˶.mZv#pHUlV"?VYrBLmFۿ>YGt A8f/lc- fz ,-{pYEn¶kJ]W#+tP̊Hޓ,ta,-:jIY͝)zdc} :v.RVg|^,ŝP_/ooñ.\ߨyL#F]o[CW urSAc *[Ce_i۩!`I>$?.Nu0EȄpF {l"".?H HѼÈ2񜊄šqW$p.HzﮊF=+nz6uh=Y.*^ܲiq`Y dX%\\ 8X4%!yKz"10&3A`~OG`Qۻ7 oaf(_H/g4. Íy`?m& 3_|3Z2Iݝ~Я:W߳AVAh6& GkP{l`!H{yD{D"H17YQ)r7&dN1g#UٖD(<b"/R`UY&%VY Cܦ=pcpޓOEf1Z̑읕*2 8<3J?hfl`o\PN Ŭ:\Ͳxc'q/;ݻ޵}OBnm:SKkjn]!lrl.DjݎnfwOfeIvH-j̵{ﳞ|;r>2ɛ[10.:O[7"C6莎K_fE)2| +B\b]I뻥66Y"̎|2u+NK#oKνCGFn4|HkQǎwuwBw>UjUgX̟Jmwa8YIV;W]铲x>PI—w \6X)Js&[vF&}2$1Ep"2NJ[u+.3V\QdR)E+^~ɉCOvcT*zc[qm-L(6MnS!_5=+}eʳ^+wH)j)$iK.9s^C0hb`y {HA1R1{ʡFYrAXAeb.EIQs- \&'U3VCZ($f Ee_(_xT_k \/͎p7n~Fӻ7fI#is`|-"<ǔ2|rb 5Vez %sQ0齐MM@"a1YM`u)3!=v5qv{l?+e_v5k^[^{@[#P<  PFOV!dR ERC2,GŁ"dB$OA&C&&H&J1F(Nƹ8aԗ8똊cGzDaGZjKi:Mfm(H+̫hKi_),eU=EΘ6@qR p \> $47FBHZpxeXMU S:iɾ~QUj_ܚ'_<_?C]\XE6Y_{Ɗ'CLWC-z.۹ ѣї H[vV7`]n 4< jG#@ʞ EGwe/,zIPQP(T2%WM[@[Yz[Ryօ )H s H#W̽*:#8PPjN[%XZ8!/0!Iו݉dv)h,1j4ک^v(jMj:U,\ǫk'r}IνAd L-Yv:|^ht/:xkSI0\&oUb:k?UY )GY :3o㝣`GA@]m#Ahnc("vo7x Jlы֌8kYs_&-iLJ5׼ k=B2GێZམ.Cb4)8}ZEZ~ju u)AID2w]ƋoW|zL-W뷋ǟzA#MS?|@* qڽ, e9?wɆ=ME[ʱlvĆ+ b;' '@ZPm]ott)|QV jU;a=pԳmtf8jR:@A*͹!f%T_WWS*1qeh{TbNd< (uaԌ?ĻNf!2 e>+>ika_R |:,@Rfqa@eHYr1,\X-P&йlO묳/gt9ar}{h \ilM|?}K>{즗vs4>K.>S\n.Tf~J8,\u6X[[kZbwV/x`a-,se9 Id=Xy'[ۯDǫ=KnK+DPd˃^YzĽn֕y \Ml9oxdI+7e-){U)0j[O.!ph|Id*E 萳N& \z}pYcUjUZn'CtZC<%Ts13cQ:>@T}io@ ]5>([nAx7LkTd) Ѵ׌PA'F XS?X%BA b 8X[$5-B=US=OWjAcQH%& \sNe!c8qX :xİ_<` }juT07ANrA5i&K^2vy\B;Yi&I`ǫKLxrF`Q]`5^G!P$EbtLK; oG.VHȄwqPZr,u7(t!m=Vض+gk7@[>E)K~K[|qqщWYuto턔dDT9m\4,\.bmÒ71B 8@+Ҏ"vl{D;hLJCRY1!WPYhlZBbBvr eqCP*Ņi]%˪d3pM"h s!  
UjF䡛G iSϷ"Mn-]FIj=1O1ggTĜ6 ΦyWs)b^"EJ CwXĜKЪzDŅ+z44O^Ms~~-h<78o DXh9 q|ie(P~$G(LcQxJmJh9Y洕d *iDVV⼸"LbC-[StYzt[؍uH:vzKЏgMF9# w&J4VElEYrB]wSZP>y(,i?]g%sajQ>A $|ʣGRn#w$ソozghq!ԸǕ1/o~avh!#7mޓ^h5OGY=hfKe.Uz3UeG8bV{JTπ9ջ3Yli׎^\0l(8I.8ThT_3ѡ^v]ɣ4_nj\|Ꟊ3(:@ ĐCeo8&T |sݦTƀϙ-7J^(Vz@V P̤6o'Mh)}q))PQeǚcJ&?Lrc3Kʻ:Eit?Y-/fX ϰHE6K y!dCzwlQ*+eyX_=}G ΈkqZEQB1YT`Xu(O?b/5UVR!28#J- ,j/c,ln ۅtHR"K 1Rѳp&  CFL.AșUp%W@napf _07U<Ѵ:wifz}9ήa |d6».??[멹g ?_:*#~6uv3YnPQ |X7g|ykL86su2n&AwORo ŽGfjG iޒFc zB2mq.63zR~zQOlkDlK3 M缦廦d vewp1H~8..];E="8[PYz^RM zw_IsZG3lU#'y8404߄᧳錷4eW6B5]CӪ;bE|ۋ@?aFf4WL?F krD}bv{=݃ϗ7=Ө~Vf] ~+Fk&g_^j+q[_;77ޔ@NDM>(#l`q9 ')͠;vsMȫ!"Z(X9ĚQlal5/L NGU 3J!s>ʍn\<`;XPz~f1wjQ7)ٙF7>"6]~n+vC-<~E:Enpqւ9B,H]]w>GÛ#P9vw>2I< ::7?9ZI(P_sdm?U _Ձ%|.ǿ_@6R鏛ս),IECBj|!BHz)4dEFG Iɒ()-$!!ads D'M!JP\IMA6 f:f7LwڝҶ_àNfۯ8_\m mJ0ȮSVւik $䢨 hж AN>y) %~Ћ$lW|B"d *ŔcASl&:D[&#+d YV^"r('ٝ˜x;>2phgk M[=n1%8Ϸ⛗N@PvKSY$k l^&ͶwQņbHm y"/N"kXLw̟?HॲXj'ԇm:yiNt/G/)7_MguEZwx_>CBN,C I)6arNN1A~`C㾛+ht-oFFl{JLiDG'v<3O\ZSJ&Hv2,PF6-)2ˆPEMVz$zuPڿG|>=)|I qxb,pVW"Π)z`.ߨ(56N裎Q'KZ/އGtiqLQVÍ)٤s*a(cqU2&IkH%{"GѬUV`F )eEL, | ޡgMٳz緟Z LO+Sp.o YҽR6c.ɬgm.^[ȭM}n 7_cosVW79tyu>|eW6 mmȟ/=9+Wnׇxxwne"s0;N>ߌG-$Ҷaww!6b~KFo[+MvXrѾjy6gmIO5IM7j3(Lf&{&>&KrtzÁ 5z[*}uFa1l("d0*Rh+UlJ*ϛMA2hU̴ɖBAP Tؙ8;vb eϊk9ozigɟ_pqx=}mtZ)] %Er/ 4f~C)(!6XI ADMHLaNXW$tƐGv%nQZ11;ӎCQ;FmQg4YG@H)F#ȶFi1 y4/]YaHSaXYBb8+BpEdH3!h)0-iI2D;g"/Wu`\E.3-9UǸz\qqWWŐq6:ի`5x2B@ ZHR 3dUρǂiǡx;C}>u} sG?n~ycvqy=eNt>= oPZ& 9_KHRQd>;;Q;dc)bQl@d,W2At'S@cpAaTx j].$T.ɓb*SR䤱tMg7^15xJ3qY'/oOI5SvkVN SiհO邽u+l_%(^v޵$O #%H$b5xIzEE EJjyTu}U]ձT?jzh:>n>))dkO?V\t{KcњB~4@fZ,T_`tyYE[cCSï_[5fc9j܏,W7 )&埅loT_(3Ph" %!2'A̅ƒ@o%rz{4eRɵ8Hu$M}e_ȅ .8_OH?'|.ˮ5>})Y_SUaV沪7UoNRoW11jUA9N#U x \nZբ#r>B5tkzz滋4*fU)g]5t^YRV`{t~҅\*fוltnaU']Ҿ)oSkeБ*ujO*fVzQ5`k98}c֒E0iL:g촓vs{:{e{D' e- s)lQ*mp% !Nha m:s-d&1cҾthA[]@cN#'\ FrkZΆCCOМJ\K?ȷ@![̕+Jft3G.v?,vS*A^nP1 DPN?L^ԘD uJB:G,v/,"9jng{O g(A]1L8C 9 #S: I FR[T<˞Z 'AFnq3LdNX41cYY`6[Dc(wc=l8;YW5[DiԒC gh At4^Z+fcN`,mTUTϯEߣN֧ęZUQ1g!IC&.!1ީXZA8'"63/V $,"$ :qt2q]g2\(zO#]̴Lf\ 
5eә߈]KqH舓+=uNf2Ԍ@ersѰ\}ɉY O; rI<[97Ę37؅{tGݧ9shkՊ'IAg#ϒ S>eiu9W9FǹvsEEQWTH%hc_X_U#n# ɉq3Q6VUE:.]W?68Lף>{7AQhtsdZM9 M7V09aj,$8qûM!Hey#r! 52X!fAJ:PΐKuuW^)U70Ԍ-Ϯs˹93|Q/ף6W+I#|a~EK>?H0y%[P8퍾zc8F{qrP&>?_Ћ~YBt3?_59'n!XE/7׽_HU_jC6^RUeHօD8'p-{%Hc\ҥ٥RrddJf\3߷iIe=ܑ G0xd|4\r9^d?2_5pK"9 Wzcyߗ3.%[\}"z,_Ȓ0 GANnoYGG~sF ;[a&;^wһYlG#~/UFo)% E >(pό!@9FΠLgw3‰sN,@J+%0Ēl& ctv5I8:W9Oc=tɰ;Ċ#%Ve2ba2[oy\heB ߿MC޿OrU  t7X`{x'4f[Gbl+9q}LF\:i3gzqcUk.G}UFF9"sMP5dPD A[Sl_,ahCwSBh#=aIKt`0*jt ;^W$$Ι+2Ypk%+= Nz(fSrz {6dJy ݨ )W  ȅvzw.Les>mq82ܒZ/JA_|zn{iȽewVnsLWBpCׯJԖa͝0Ж".nV`+;(jQQQ}jQQQzPUb*p-J2ZNFio#IHR;Ԏ$#IHR;ҫڑr֎$#Iڑv$IjGڑv$IjGڑv$IjGjGڑv$IjGڑmH0ڑRWJ_)+RWJ_+Z8Ӈ#?K}9W܏ep=WX49G@3_rAA !4Yh@8 NWutPCtG/z{i-ִ9d,[ %l)LS*>DtrMD2I s$x И݇,}ۗAR;t'[kKd>Nv:E}d]Fj2z7tT/C@Y;::'Ov=NPm6?(!\ C4*Sv9Nd8ge{}p|FwFGtF;?@h6. x1W [csc8 WB,S5LVY6At$ԐJFڰ*%zE'@BOu>d6=7Vd<Y2 bCJ܀ewlq=:Fx( vF%쎮׫]>].Lw=k{r.w-OԈ[2 8zZۻyMi\v(-i9gn|xw<[{|0Ͷw}uwg.x}dQP=_gHx[o_g w+mm3x<;U?q GިD]6iF;뎬ͩ$#-Uڽ8A{aE=ʀi/Wi5&Jv7:G$($;r(v>VT@mc.!L%"nsrd-7@pǧDVNS!T-5+H:Ƙe#C LQZfhWCѴډp̵咕nJ&\1xB@-{rgC6݆Zϗe}1Dv'Eg%OzӶam߀|U5.B "C\b]Yt&J[h4F*V٢5삱}ODp)ju\Aa0XaZQ>j Hz#c֍tn3P,4LXxT,wsy՘CZ]sze/o@gg?N?8bfц ;&rJg8U&ї甍!ULVBX$nmbQ`)7xmBd"\Y .k8;.96;Em0`&2Qka t)9 nMB1(,ŋJbXr V(da22]3T/XSTX[DE"'ꌇ݆OKm8D6?ED쌈8!℈[sC U 'fE5 -ն"r!!ʷlnJ!1hG"k( 7I%UEI@^cne;#bF?>TyIpqq4nP\qN8 wˢ !GmWOdb)%ZbrVl`,Y)`1`'\<.͎CuCw:wc19us7y?#8o].%B8d٧ھ3fH:#㠋"`w<#ьFwL&k_} V$kU[YL1_#1 SL>kdHեKL,"a0բmYdwVk+5ܛ@n195xJsS(;AU;)7_m⚍>7t_=tzW'0 wIW_Z9Zr{WJHle8 V!WL$)IT~9AO"xOf@gʾjE6 !XU!`P593@3[XsLM{a.*"Tˮj{04g}k`PS6ZfjQtՇ(!p}gŒ#݀P4>n0;0^CMVUB@Ʊy-!dcdMNEODgVd]w~q6˴r6١&0vf b |Nbuo9_n?WԲoPD<6uN$hoxUa䗷+_܊V[?lI}dE]ӆ oNn:iFnSȜY7f¥-{ùzRϼ}GoS2yЖۿWYR&\{BMKmWr?V7:ׯ\++hw޽uݿ, ެ:7sQybwo_͗ߴoiFf/P\Y󄿽 䊗ß%U#!9i@nTxhd&^ODy{M;ovS6/lʲ]l'aB0Z}tf e<_|Ё\qYgtG5@Eف!SqϚN9sTқk#-kv\ _Ǎ 4BCmd:)%ktsT%وU bvN7ΠI-_fF2VRP|4JEp7OOٟ_ӘLsmgi!Bv}>}<=s"k\̣G/v?#Uj)Cя&rXe_YqN(qrH,'D8 "0L DAE|&UI,Q?UI)֮@vBJIAb)\&_.x~ߦ8ڏ'3>?QduxoK+lQ+qyf㊉HU(#), Z1(녪4bI'(jB>3_~(=p ߏ8L6*tdukJj8tE'k FWo65EDkj\^/q*ZQ'R5d7Yd>`ġ].R[ٝ.J ; x.ºR 1fفO b ,֩r: 
KeAJ/UC'z^0tTW~Lnz[\*XY-%Y.N/P7aˍYdp!Ȼ>E;oݹf4?6YᑇN0zn ګn:6e?]i}ٳu=w;t n*Hds:mmMpUDgDK%U-:+ Bиc؉'vቋ6_L\^x]5*DF`hT4 FeM@r+N[vz]@E4Xkڗ^q2Bx}3vLht sB.g糋/!d%_BVsC4%IO^?is׋֣Hͭ@lw lIF.[Y רExZNe]1)cRO-`f*c]ytw@Wa"l_?||v*xZYbR ŐH8ƢLh]BWE(Dը4U@ 5L"VN0'p%16; 1*Ϊwnm%o8jNvuz\S`]P0gkKֈ-CjgĨLdWDQٰ Z}p.UPuΞv?]{m-li*K ʒ+ZTlT@V v`:mU)ES9)d=M2@NPJ G]XрP (R)SR{;ǹT+_iI',fVIGX0? J9ÂV+jJSkL8F4mN4l4zߓ(nW+(H4B 6HȪL`dq! Q0Y[ռ-JAb,m}Ƕ>'[jB5bQD (  RcH8YUrDyd; ӗXGrutgPgZ5:ێɰlr=Іҍ.)K"O&gr6h+*]lՔ.mu.m 0ҥmҥ?aRapJބ[=|a!ac}=ʗIsOf#DGU.QYp_%pVyn<D&6FA}SV7􅇳G4{(-YZPzkLA ﷔\3ļqݢ'v3sq 9^gr8Y>er&7i"(1kmŭ(08|)Xa7QDIDZ1J8FIEIQĝC*_(X6l^;f&Fp#zu!ԕXԖڪ!8D:BlRcwmI_]GGKqw MncQ$,+߯z)JbYaOLuWJ*S׌K4:D ]߱ |) (_ P\A^M K>_ `@Lr̕jUbR*R[4W %HB~13Ug1ld{ /AEĩৄWFT]5NS>M;&BJ)c8A^oo}ߊr)oմon4^M ᗅ'z5Ǩ -ne\WA\wDxQJi!y> hS>Ӎ=E۟GQ*Dt<ꧯ??/o#(Ч|P?7nfsUhb94spf35 cHZXZG OL3ub9̩w^-ċk/pK0爐F][ƚΡFV'6J@fVD]^2CKK3!o; $Rs] ؐhqJ9,|ۅom·] v.|ۅom=K·] v.|ۅom2s vgom·] v.|ۅom·] vmom·] v.&pF_ӆ,fC: YZ&}Cz˲! nHW·] v.|ۅomgnm·] v.|bv155p4OIm\C ;A ȚirW;ꘓ80ru[fkԭ3l|>H1 5L')R$O&ITĄ -'"\ o" jhje|&7}1m^("X Aeޠ6;Mz %:lfi[8MϫAHJo>x`̥u2Lq‡(5khtJT@A!TZVBUhUiꝪQcuy\"/Ÿ+b+ByTS*jXTH%PBL9/AW;gP牖ʛeL^p)k,(U8#ʼn%y(O'9BBAlnYL~&DF8U$TriRѱ$U0S@D)&ϝ?Ҵ+q) 'DjHLtDBHIjNx"ZZ2; 2H}ʝʄ(AS ʝ]@y0K^'0aBlLaBjKB&D4 ea_&9LrN!&Ilc{0,a7 я]>7M'^j:*Ehsw"4vskBݯ^᭱cbfkUhؤ+|8I&6?G-FtW~h[LAMBi5n1qD]!$|R6@կEΣ\uK:LOߚ<,_ =ۯ~T~lo.(~No>xq'$T%8L@?It7)kmGQ FbrF4>&.^i7O3UDLe|?Q۬9Im=kxڟd]ଙϲk(;JZ/Q%ی]osgr^7tIx۳8 aF}#kT?5jo=[p\:.Oכ/6,i]io|ZsYybUMItbGoBM?qx ӡ9W~b"8y=4G7H?.N )j4Zi1SX KS͝D]3F؅9z&{7Jԁif:G? (gExT5?7dr4KlQd,@=dX.@Xe)zkLSnȖi08"dyyߤSRL¥sB.2,D F'_$,PIfsUPF$73SӜi p2w)U$WXZ"sJ{iAGS5:G@4ә4'B]t|N)WZW#äG/U+Z41'b^Rkḏc臟 kom#hk|<T/47гg|=?YOh +r.@wx k"P%Z3۠/`aV*[BUUN%PIP-ۄ,-w-]J z!HʼnPݳ^^I6_"*`! zmp2kz.tPga 6cf(9[fz o>p4m\[9o*Z&Ce=:z'(SxYn?]g`Ġs #OϬz> n0ҙa<o~t}@7 mgj˖J Ξ>r9v޾/7TVAF֧Cf!;˞[_7J6Cl˾9o>Pe+< _NE=;$ْ@? ? 
XFw\NwY( mR/Nj5B(@kIA%[l"ҫ ~ȃ ~ ~~@ҁ2rwE :[$x<tF$XP ,IDtzR9ICBMOftJ`U Jn)`-cgl3ҙ-L3c[Ȋ-ܫ-^ed|H C'~{Bmu7~[iʂK2Y+ %'пDo B̆P,bgE-"X,Z?֜rB'-;'=NlOS R"*6mZhJTqIRQ k2Jx4ŭs-bg&tmχ ><`f"x fNцfʷGww˓/ U,>QaDD&`'EHܭd67 1~EO0i-iv]VEG&SkX};Т0' Lژ5;t-mcu۪J_ .dsDlX5Z_KvPP EbPMv[flyހõay_N|M,-$M,7fQy4/SYs?rǯyeLr_?QWm'AjԠZ ڒuFS C}Fk-ѓ6uEM誓nhn7CVP' zv蛲b@Pm(:q1hT'gJɘQߔzPY D䢶t%/@KqX8`g=붜-lWj> -Iti0TdKIf%TtXQvjQGʀO1]jj >=U" C~aIrV7%[ȦZmFƠb9vF~P+]',~#>UsbSFG>N^ eS Rsm#= GñWG{db=:2jV$e,&#.u9H!s`]; . XZ&5CEDdm/rI䋞uo#8W_néӂ^5LJbrT`򍳁9UæhcS:/MBvՎY'bmO)O$CU'K&́DK;Y&$dJ6!lRK:@(XX\+pR ![sTD7u[=T =@km3p?PhrY_R6N̿~Į@G/je^Q'^M'Y.N0mL[%;#n\>29>]ScI)=Am_8S rtc{N>3$v<-K"K"tbI`?R VX'(m.ŷ"*W%QW\q\̢1J#N%m4ICRQ}:F'v[fgV,tB$lVpdgSrJBC{cpJ b}>G2V֒1?q [̢ Ϸc;glW~rNd(e5q>(m4Հo s;?T^ Wu*yA8k 6s9:+} Ώ[[TJZo+9^wxVn~<fTnUUaAg}keT$F'UQ N7#Y] }RPF9!ľsXZX9IY&m95^U]zכ5<=)~,_ȳ{Kk*>_;B2%L1=Qta .C љQ ڣRHzBZ3ULr1E*JHjdEv^Z%XÉl#r}U,9goHNa!Q#)6m.j7?rtVtȾfKZnȟeye6)S4B&uqZs%g rfSsي7d]^@; -E>vh'I v:,2)x(,P,擿Xcq.P"\gl#u V˩j#hC-(w(f5jBDժ?Ͼ!ϵ5mU bF4¤ȅA(PMmTz \Ukq"U]ا5(kj;o1eqgeEc 茱 )!X %)~6fkQDwP0:Hr '>~99[ivr|ւ<&MvM+i/^\_٪te8P >N,___ߕ@uU8||g^}E4^]j}ߚ~խk|j7Lx0[.g`|N?|š-r\'Z.Xx[ jhv<諃]Y|5;\c7'HA>Ӄtqף#~>Q>+j ,6xEU}ٛ e xw,g)~}7mӻr{˟^TIlGM!*N%rrZ2Dor)qUIe!J4;QήB՗s΢F<&M/xUG̿[vSdY-F-?qkJ3N6s4(/"y=Yƚe񾏯uӵo|*H """ ,0c Ii##L l;u[y'f76ܐ۫&̲Xk>Eyֱ= (68x3 ͓*J4$K6qtĨ* %FCp`=7{Xui2\7]bjWzn%$A&)jNY]2'FKy*u2.!W_MĂ{\O2N$m־v[ݖ*0nTvp w?MӷSi1;%D12xiNqv8sRKxM1ihh$D9iaG1-%9h_ (Z(.K 35d& ZUhSlrMJ8#xF'%k;r6hIc5%dȒ 0UaIQ2Z2;J@U(P~ɃCє݃"?8Q{K*UXj[BJ18ʅdh=nd݋לdQZ=d*/A,f -WMVc0Ϲʝj_ӡnMٻ6rdW?/.MÉ/`+#yQ`ap";dzhyu=˳&㯖w#{2rqPYf2d[h2Ee&%)SY`_x@扭C9p2EͼJ?}©+eJ~ 畨?]J:Q+EZ)QTxFPNRNô,֜sDOAHKgu^#MNH$DjN7ޞRIm{@ȏ\rv12jBڪXo0^v\Pl?C:|z3OkX.)5)pb{ UG! 
ELs)2FX Jrk^^j9>F4;pP<*zy4( n.h0v #yxd0ؼ,؇un6<ߌ}{=pϔXq [v۰>eˣkNڸyL6zRh1uqJ[J׼sUc.ze:`soL郟^RN892%+cZ\+;t#ei4TSC.Ki2īA#sR!FS"%YQkJ'ΔteA{Z)βBm|5q'7RUhaH㯣Ƌی]S/rݙFcLe ˠ23p3VI :!tx,{{#i Ӈ~o`E෽}Jʰgo;# ӕu3Oqf[C6S9D8u1.o޸y3MEZyvqu;ǍTh,}҂w[5PbU RLy{GvG{,*9NZU^{nP/۞iҔbZ)>p8_ `p\Ky2|6.IO/G 4& jС`,cq%J0VV>NҊ>cIƤ`.ιIBAfa`ڨ%^O2KN#Z aN&˄16;kVHo^sumNYv>*M1Uq)՗!^>Gŷj$)m4aZpQhpAۓIx95WkZZ*V|Za9Jp3OVRYsxV,DãRsrB䴢)ЪA#:%jRyoA,s!S3U9h8淙c8x>0-ydF#n\oYqW^!Eؽt> \ftx zN8}k S8|NMr u8Q rxOckdzw8G={w?ͦClI%YhzzmZևxݧBn|ΏGTN{:.]+=YQHRMM͵%`nfeëO.8V ~h_zls(R"ٝ-Jە"UmߝE)~w* gv%8W7lLuvZE@#:;b%s$'.5N=uF"nQdYi Go *[!(>J KYzcmnQ%>Ikc>nRe;\S2i:ӌk!AB H5BEЖ!pd̆w19 $qh%7F2z j5q/t>is&wm([MO*ŻJ^M/Tڪd4ѠRHܖ eɈ y%6VK^h #fY윢^@ſ£}|Jx|r\xIQn3{.2ʽgJ%)f6ꌶ563U9LCj%~Q\31ۂմXQ3؝劍@9@RHåUl[d)^h Uv -dd(F3A!qD\Y4v2Vg?6F2Qƶ b58EeD="K=` 55Nhse |v?!1șjoTsm kRI 6+ U<҉m6GecwگBT~l^%8"':m&8ëgA1FQosGJi<2'x`/Κě|k7ċJ̋YÒiinuDn$kN}^\ "Z8AdA\,|Tty_x8LSz z.^I$h0>*Icj-1Oxp@hDORŦش,@zZӪꎞ<x3*)UMJR*8$)ռ2@Ud`JMŹ:b72\P dCAC֕Vb4,G)nt U9k.*/d!+uߜV4Zz]o)ڴȵ+DhwץגO![7/ɼq{:]]*lxw?ͦClIzzm=ܭxG˫kݎ~Xt]<˪uSnj.l[Kn l(G[ڹzlsƵbl*]d#@h9hWt~&jɮ[dV"tJ$hP*) 6If̜<$H68)V]m#GEȧdb}|7d+$rܖ,S4'ŮzY/J8a (.f]w-o){eL :kƆL톚OkN7gZwם`m?lJY/}#_6ll.?9*U5%b7d&,`0ZrVC1쌠("d!jԪR;=%E Q* COI3r|}d_ qfXL3B ՈݜʌOiypŧE.Oӳ+Gl4)! 
%Erv2!dd)bAa7Wg'% & 2R$\};A(dWٙs!8;.-XPvaD>V SHQ$a, ǂc_Dƈ#"$QfyfmI*:,C?-31(E@@ݖp$07AbdQ3!HM62=ik5ƈL]WȥcZl%nzToMhEIIuJZ)^㤔GpT>֧k}QIjH3_fg vN$L/-eRT3:_:BLJcM1<`p³³(³fKlԤSfJA*g"2FsNt+$!OAfEIٙ"j8zm3Jd[_1`﹩|u~v1oϦ/{Ù&Η)Cqu!6IOǜζz$0˂@I^{ RN0E#0w$ u=t{N>}.$;k|"dT"f)H("*=3x}њ,25i\DbwO8tZ`^"Aj)*1Sk&vƘ8!nP?VUVBNg2dd!)XyAu'(J'B W}F}k]vVwWi"}okF,v=vo^ȿaVkҒ.mF;"+݆䟡?-x/roSƂu"=`"oPZ;2bY0RuޥrFyTdII(3Auq3qv3>vS?)ӎĞkj}NL`@x2"AJG`pl3s "?`E &A4PF&FRIBZ+V8wN WaL>ۡwh=vۋKTk=.gDWGy4}@9OA( dthosRYK}gAdFFi<8zzqA_ #7o+ OVȵ6Z6%S(H&2x F;>nRݕ  mQgw R :!i+q䙉I碍ŀ ɤUIN&ad`H\d [=5>`Mf{ OgoT.6cďZ#cj2m%ץN~'6)6cՉӚݝN.l~狓>Fw"M9 "_y,~ZevΟ`3~:tuXqzk']Cp -Nc߈_\v2pjmt.]yX#6/v&n|ÿVR>&"䴃ăy+kVRнC#I-thlx̭C"ߣCm{!ƐC=~di`w8\eL8Z*5vCFXɪgr;ӎCj;T+>IHI*tB\21"f[QhFkzUc9hzҍoe[^xX #(|J4J#v4O#Ne-OٔR7/SRN<uu"˳BrjbLn,H9Ca=PEL5L*v%G BS>d6}TGp>'QX J"D2HGD,(kҚ6g;e6KLxyvBRoy=bAj&)PL);)湏xYWwof;'gUSE7!KZcS!'I#Nk׈c]QOB1a*PQI%#qJgU8`sΔBn@Clrêr1m.=aEVx0cYO۾46;WV/vWnn?x|{¥H` Z&,ZY(( ÔeuP@S].w&;)s^ );=yWBmʘC%{iR:lT%›݂YU fn tNߧXR+F`efv9gʊEDD iv+",{ft0$'5d0P JP_MV^ߗ5*mo9}!eU/2ӛZW~uGQk%柞&0g̍$. 
[binary data: gzip-compressed kubelet.log (var/home/core/zuul-output/logs/kubelet.log.gz) — contents are not human-readable and have been omitted]
K)Xr:1EEPo/E*J\DQQ5Vi!1 'l|a-K$])TV`Lt{*O RX{tP,pBxd8Z˛3~~ FG̾ ɲ11\~NeYpr|(OW?~?!) ]Az`}l4??WI?A ~?wKP{PB:yLlZbPԹSIlM501~pf״ho2NLyoWolO֪oq# .fd pج?,%徜-Ex%ىoZE$NK~,>&u9F!5+UXICm05.j TQMǝ9R pW*Kh d-RNpSj9k3=v=sp>]?Zfs5b^k֭G>񘣏)+H-KXcyPcVd=RJ"tJ1bteƠҒ(]PW,1Օҕ+EWF{+Lj %`A JapJѕѦJ)-DVV3'"߆zdHh? zH63OՏ7^3͜gxmd q I󷟾w#]__ u}.ySm~ۍ}\ 9*kڜ<{?_Oe[phRcokWߛڎVoZN~?黃vV 7ľPLp=Ro0Z|Q&fXoHT8+Í4sueˌYJO1/'HteAJѕ O]W |! 'v=Ohd] ☺cj0mH#j0^W2Cuus좫d=D!ʀ].Rte=Rܢ'DtX I2+erO+LaB12XLcSt8ƠQ?-wltj?j)+Eph+d\t5C] UR`ҕbtue f  ҕXNcPqh4J^t5O]I+SZ^c ZyA@g^^rlArs,cպ~i}+QWC{5i}F~ֿ ~_?(1OvY͋ūn@}ei==pV qR [Sd]q%γjUlCr%#]&K\m5kL:%O LdXҹ5"Vk֒nH|DK'7\JvRYslɉѥteu<p)!S瑩+e—+809|p_5=s-O>rjsk n97fQzfXoH聠 ])r C)RZ0u]%-c^sԕ`(WFz?EFK]ep];8 Z'?ja4rjmi5 Jqk @ ҕbteKѕ2EW3ԕ\؞W ѕѦ4u])%.v ҕ +ō@h+Lj"N>TR৘QX2rte8ƠQw5G]q ])]ѕF(EWF+qRJ^t5C]@J4bF6ueI]PWQ0%.;N̜|syɀt,fa},ڦSE4OXr-%n1(ɥl1$g -qw9Zr Ӓ3\rv K?[r)%u<)X d<'\t5K]IDWԔtrV(.o-M^WF`+<0EFGn 7҂aS{ .oCԶ +{ z.EWFK>Fn^[/^t56iM:7׹#Ε G߻_퇧 CNГ8G F.jA%7!FTYgXeM"%1Dphs|/FWt`ӱf|= 8=mQ Z:Ol +Ztu߬SA2ѕJ,EWJ+EWsԕwi [ 18 WB)RZ3+q u` ])rjW Xy2JJf+^,HWdYbteXLch2!qAbB12'>HWFd2JYt5G]wepKѕwZt${?Rd+cX^*0 ڠ{TԷ0rDlz4(]~oF7=go T"rerH_2y;@ xﰤ9 'Åbf!-ũ2.ؒK!4>bt芙n8y]%/OsԕEBP`rteYFK`]}1n Õg! %遅(6%]7A+x0џ^?7Fi2JEW3ԕG{ [AR`A,FWT?Z}^qdֿ;vzž=ddaݏK=?Sؗ>V8p ωw_ѫ7ϚZ^m[}a'Mm} !G}GzmVQ.z iy_Q-*o8ZuY wn\_]+ zSK+]1pf??߮_~}zv펽vW[ni{=VY%.h ~z}~>`a](f~򕵞R +k0}aRLRi|WQ¢Y꿍#&~(F>կgg}v޶{YWڲϷAaܳowu}GYd?޻o>;*Wgm"WwHνW/4ql}r ݁1h/AO^ʯr}zΜ 'N,c>(һXaݝ}nw6jW/\ĽN*Pɩץy0>rW}b/^^e7 ]#UQBErKݻ=.pB_qm>/͗yDfހ#߿CԄ;\kOۿ߿o?B en7nW7t֊m/9TfOKuFB&34P tU w!e5d$}+hguյ>~1}X pW7Kh@c'Cnrj뜲qM?^K>f͐v.Z r BU) R%iHM д%UЈkNX-OXc]c"ĜkHu..ftH NKnΨƬbF iE/^Ypyx;:܂6شzwi*-j~YZVMLAHM+89ZVktjB}cpZr΁ӗ:lq$;>j$RmXkyإԓ%WuGZqlΡ V b_e}TXK/-iih M$5z=U/cjVAD m;aG)=n?{Ƒe !lڪC@0X$AY1&iIG}O5)DYvĎn E6OU::]9Y;s(CEH.9msEΡd((lh B "}`q]wvHVSY2Y+T ځ`6I( \8Sq?$X9 J W4"Ta. 
Wj,pLA@:9g(!./5܁0g ˕3V e2aޛo0CmO,Lh-WMU(`GyRuk‚ AŜA' s9_ !.A J%[&lJ%8TR 3T l?XK(a3XEfr!L+VJ (Uv e*3jƠ(@Hq{A(X5h*fAQ 2"}_* B꾺2ZRQv JHK&#CkDyYV+D#5W"jJ() e01!jueH WmqIZQ3W])4aA aNenvۋq2mŬ*$Z@QIbYjӉsBu^L عmg.&-kvʹ=?ׂ[UP 3jYS_w uڦ:LK/T^*҇MV%^v1 r*=d`j1tLF ANjK %tD\P4I"&iY!YJP `Z*1zOA `}QHVǃVH "3hG[u`QA,TG>ONXżӬ(aۂj2P$  a%~BCLm]LՐ"b%VX@;KnCz#>U6 .MޠB->P!jUW51mtEU,C>h6y*Ze 5utѫR%lP&ڜ׃jF4F,ZnkT5^pP4i2Б98kkl#:LBi4HTy2Z;(o *"eȨJ<T9ye|^8,*VRH>K4Ԛ5'7h\A{Y ̢jTRpP*UUzˬGae ҷpZH̤E>&-K `ҹdR@`~ :^tZN,<M.e˹U&A#ҳAk.nj!`3 L[M`- XiVn I{y.j (GʪBs@m1PNf=?ڃ2 |殭väDy XRcF6G= Z.5h+r).2TA(< U@%'' \ F7lkŰnll+dO +Ip " [)o]u6!3A v*vBV% EiQIUk01g RuH` t.C127iâHYa=+NFМ"5R&~@jXf=_85B'EiLA I *tD$Ơ~],߳Fraנ\U8/ڒ?jh7t/5s!@6(=,~ 33¡-+QZkP+hX4BS#Q%#p<5QzJB%aٶ\#V@̯;^"@.* 9mِCL\Ko1.ńJI-bJx*a?uĒC't\N56ץUkT"!(TvƂ7a #fW..b  Cc:VyYtiv.yϟ|z Z3y;jDp2Ǐ>&?ZR7M)B;Sڎw%\,^MMzrn l?t\Ó`@/6z˜_dt\J/W7˓c.?C4km+\lٮ-w/NGLkOv_nŇжca?mMƾդxPC2 O=\Ac\lph:x'PzKNrr@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9ȇ19\ B ԶһwH1Tp@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DNGR#r04h@+&P* @B@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 x@UF`?"'D63Z 1:Pf DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@"'9 DN r@8.o}qkLV~{}\Phw*%/'iq6_RqIz1.}z4%hKܓq1>- fGCW 0cֹC&6%zteZ9dFDWl ]5|4ꪡ1\(]5J]=B͞t1v _7w[Hm~\k+4P仝Z-D}>`BM>17yo&9l>;vS=aJn.`t7>Yy{h 'sLx6O 녋'nc]]ix1_,WӴ<>-,UĸëSͬݫ]=xpًt4R}džd|yjԉ|3Yy6 *fdR*.ەG>[S]EtJyUAYdvR$ WO&vd]w+뤔S·\( 5').(E?/rlKz|:R5)yq;S#d0nյfE WS.S톔wŵQ%7֎ޛCpI#NU<׏y@8tj(w"+vDtmshn5w!4;)jteo8 ۇ4uitu;ivhۡV [Е%s#+Vڏ܇^گOW DWқ׊UCݡP?]IcQ#+W 5jh ?tj("zt׆\BW NW@i ztpB`;ߋUCyh??ZUƳv~ՠdl( ]=FLp?Om(ҋ[K￶1 ޼.iZv9o[L>o;"\'7$dB'fAw)۹h]q$9ZŘVe`?U|-V PJA2Q78%#XUGCWUChc+Ve`GCW -waOCWC0gpNWC?5vp/"6죫ۡ<Et刮:\ exц]hL˒'o鲅%~>+ x"/?_uW_נ0N\:%}蜉󂥐Lr!\>]3zQ0ԯ6~aOjZ7W@?@j'j_o,^lJE$&_Rv k Ja%1wm~{uA*ݦOśm'ygpouwXyG ɓ7Gy}?Eos}X6w NV߯s۲E=nWl|{ ~:oOiK9`p_yvoO-nJ\H/ϿeiޞQV9f\J]"t3aק \_JìYis.I_t{qvpo`M/y[ 7\6)fY[re.'ɋSH>MmO-uGvƩY]2HˮmoSv^)Nq;(Dɓ`Ql,29AVJ4"qU0ABU^w΍^'YUI9+[*<')'<٢d/nGmZ$>ƾ˺팡-@p"P99o3'+:߰l./|sCb~\}.۾{7^BAms\oEx ݝW\1`>^:jKky/;eYӝTBJ~+O.n)^?g1A-uv 
φ$7rYoǜ99$NBl$sLhƙG4y3 sۦ5]ۃs|u)Fd0=[K }{c_T6\ es<JVI.nnf*nSܸ5Gxgw!&o^N}ڄ ~eb~E/.~Bi]7rCȴc9S1;|a<*cJ_)z sYRu&{ Ž2m`^N_O\ piϞ/Xnl6rp,;˜Hʳs|,٢3HUQpEB!%bdۺllf6РosrN³Ŷ7pQHI/o/~Ʌ]5{q>zկfv'YW+vYi=~}7CJ|q?i^>hu.P#hޔIRl^mۋcY,Wo:5COΖıla3Ϸ,ʕA.Wu%\źب,ӛ?|խ_rz7[q_VFbVJM?t0]L?|tݗ Z2ٝxp2`$6Z:]KBV߱Vଦ_~ٝS?J5dk=~w__iG׳ ozÖBm>eKXT[SY77:}?-G|ϤwcvMO#;٭iw=ޯfUx䀙7Fևtc+;>O+\MgUGOQkuj?zEyΤ0޵iPȠHHeHywn;޳bZ K>ĎolI~~=+XC،/yN DZr9 2Z:\n K$Hv" ++&?hОW%rM,ChxRu!k(z-"$[بuJV%8I2`$ChCh}FgV{)$ߛ_w^mM7QdItFt%`R*@2d mj jlQZR6nd2$S֑)0བ6 j3qj~W*<{wz}6 |7ӓoq`!+jm:Æf#winלm<$wաzRn<45+'}nif2i^n,Km_5`4竽WLjv9t[֡z*}s.:^ԠC#Wo$wB7ȝCeh"tj9oZA&DUXR!)EI'Ohm%/c,1=ŶɩsJ i>Դq`LpErA Zls;4=#)}6JUISNj ?eҁtÎoŎ#ԇh\RH!4Y' bb319YSތY75)8DFv 90苔x%%6_Lro}8j 7Ϳܤ21h8qN˺ۍ:31c6!2jנ%mlUXYc}l:F|Fsާ6)͊3 8 `ܑMN %c(Bj.* ZPN#kBj]𮯠s;-&ٽe>Rhe7v;p0g]10B9's+)޵z]-ort g{ZT (#B-C6u(ƔTAH8f8f8f8fi`O ET?* ]$;!HZQFՍvLGM)*|iSt. }KVHB22HO&)_3qfٟM2P;d]_}k:om7hגg7O(8^ ,UκDhD6l2f>G@F,.`vT{Zkd3lHA6L %&؈߈XYs2#Fulpr=Th=O^o>u냍Wm?ty,=փw}!gOjݬUQ-DhЦ BG糗9FOD>U_ R~Btxͣr-O]7s4`z.]@ 'vX^l =cmQ13KAY T$xLFCCBO-x;ru>s :3uS>ιЎ. 0pw8,o]fl3ϕ-/)0jo)L,i9ӲI Hv<{ڑgw\>!(j=ɳWZyUϠ}&KY+@GIZ(aD "BHJЂ qL~Q:GCcrwYoqѿ-__uZExLd1TЈ4WH"J0hA [J$Puqu1Jjw T,,C[ b}Vy- iQ1ߪIM9 {.gׯӜ|:]nl=vM%=W7xmb NÔ}%\LKaDWLIIVWxC]2{kߌ~Wfr|' Y0N8={Cawz4C*8V!f%ܦְb}VDN>3zY\Vpp#XU7ZQz)3h -~E ,..>_>@ 嫛M졄?odt:=8p1MKPb~S}JO_ gD~C9H:3f8X*%Ҟ7tGf[:7de}>u3gwubVfV99']+ {߫'iՋt]sFW=}&1tfehOˎ )Yw]6*D896}yxNKT(XE@T23%?3$ BA"Nc iVYm5H܋Ma(E I JB41[+lgG*^Tux{>uƱeVObА=V;Ǔ2/SN5?7R./tG\q!uƼ[.Stc6ڛ@ ԇ{M%:}̍T~ɟYDCl#a8X#T 6*WeDUN2& 'mS 2{>YHbvlL1PҩxK00ؘ5Tk@:#Vv3+=Z=鵞Jݼqn}l34ld+۬58xsvD7 -&j2Q[ IJՉR7_|u ѱLA;RԒ9˗]_zn֍w_ =[PH zFHRV('왆18%tΕZdsZno{=YpSWŢY@6"[ u )%)r 9jge`S6cJph-0ibY;\vP2g;MYgWxfjo`K1:BL@PT^A!JIֻ i/#G8 g14&í21gP IC&.}``{f|P+GǤIߎ{g|,hZ,8_ %âFQqRxDu8M'vV }vm?h:14h&F:/$TA X%r.G"5媵!.гc1TdGʄi>(j!BBjr: ɴt+1,EPLE&EMhk0ձXGouL(KL^tɮ萶;dMx"ҭ%)('ZlH&8{/6EF./xe{O%)^j3vm ִ>JzkM/ܒ[䯎_ՒQH*Vec4pRimB*s ,~,Sl8̦<8B (g9\ySF$xU}s?N IFxu+ißA1D"a8zC0$i bhXD( KF Y*s Qj'#r! 
ݠdp)v)5B%=:CݳtRpR8 L!cY6GLiI+dԚČZc)6Jڕ ݆PWU+uHBiB.{Nl2mbsN S}qI]uήƱFdc L8U$TD8NWiYV G*$p>(F6l Y#)B]c =?ѷ mA5H@ trYsC ZΚWCNgڭSXnxYWݿo^-r{7/3!mo|?^'FܡqpW-hFOZ+폓yOJ}3]t&InMʀ\xgg݀xCuuٲWE&eb6Lц2H/o}wYЛɼ.h-^^- Hxa Kow\Ŧ7tc`pC6n:g~9w@7hf&|I)c {{]/vzҐ66W4EN>_6Koǧ#YЇm>k=SXa"Q "V8'[Yh h(CT=7yJ})Ovv̌q1%W{.KɅ+2E0 -0LE(~j^.m/ hI>|/lzAFj)nA˃Af@lbU*Gyzg{@C:rʭ ڠ4]EC,dovJW._ÉYq8\Aj淭upuUrd6^>?nxN4jpLLI)T$l:gU)`h \^ Ŷm7jXwUkln_bsY`&80:tOOJFde{c?K k/T\|m<`PRlQUi=Uj 6$ ZQ IkEno"HD%V!(&ĠwԔIN0YjZMҁS[^ 2\{Zq d-% :_licStU/]*2hEEI X#yJjɸ *avqdiR֠t;Bsd[q-ґ±R=XeD%T@EQq#z{lH3Dց*`$VivpUoW_xMc!gMcr`A9$²΅d8dfx&c50RK_~qcsYGGOEeyȠf"ɀRq 񀨢 *$}Bws=ufۉsn3Û>tYI dz G %s6).K_ӫdQϚ\t1~l.CA!P|8]Ld[Mww}qjUWv4qcqV0MY9ܯ9-*GJq1^!GjVuW+:W QZugdKcWH޻̷ߕ]:=o~_Y':&2>0JSt}tk~$& Nju|K)#KKjzJR2{rEu{%Fc\ҥ.ɹ5RrOld6a ][td^1 xxt+ZtE/rXOx)7gOdFE/|7TtGt*;|QYTtl<'@oH%*Tԯ̇/l,TՃ c/+ R2Y2$Ѿ%Q4jTVMwNKgW4t?]ovk}^+3~旿ܼ%Cc;lmsy?s|4o޴eZua,FE0$RY<30!@?F΢LoRU‘sf,Z`1LZ FH*CJ\4Pͽ$37x&GCnQ 4c`jӌ3L_߽HZ7)M~/ɏ>q +JdUW⮊F*RZW tWi ן~nlJmΚ`KUnIA >P -4џ_{L'š&g\[[.fڞ kO㱍cOKE\d#HZ86屢BT/4Wy/b̏S|Z6<zQD@'ro2ƭ͔iǓ|gn&g9Jb_\ҨG.9 eFm(rrY mM)f*.y /.,S0Km&џ+o^A,PJԂժ zL~~2e!VJZSn+KR(E ݡa}|cф{.D+}_+`KU:J.Uɥ*T%R\ETXMUr*T%R\KUrJ.\J.Uɥ*T%R\KUrWZɥ*TUJ.Uɥ*T%R\KUrJ.Uɥ*T%QP%R\KUrJ.U 6 'N nSZio}"-(e5YO.jAoՇ0PGTaj g`*Ĝ0Sx- &v҂d ,;u(+ +"Җ9x t砀`9!:4:shB#cgכ8Uxu1 Ց_A1D(5Bj)g]&u)SbE?l e<KI/}G#C>[9iZs)ulXz;U43օ]B}ʹLXĜp:C jNR{ȣ}K)wc/x)N''L - qBb`d{0N+{ [;|Z'@[,ɠ|tcoAIIСFH~$鶏 Oiy^:yd50@=2U~ L`R@n \M{6_CZؙqtt-f  R 1YA }L$-G|OR{^"Ӻ##}OQ0V MJo &:5;LIN$ߐ4CiFv;d0FVy*?_6k'rҚAd0lF9-br2SFTBo z,sq@LAqR!W ,%˜syֿ7q8sР rXݛj^<:&;blXA0Ye'gM.430lz:ը_0-?-#~GKJ3Mzyq>bXڶX8}=p.?(٘n.|!^R[d6Uf*W-lV[6ƛ%[e׾~D|3J5p@ f~vҿi{\6 gf/.Rt~6}ma:eAΈ`|#=KsL9 1m:2ITF{&h$Fh2i;f]9Cj95ǗqAb jW6TFmP{`3=Y9<wi7Bd}r %#bH`1TC2,[%)db42$CX Z$a&J.i# $AՆ[Q_Ob*x,Xm~싈2"Dܘܖyn-p,j2kC"TUtF4G$b33%-c#rƴ]ֆC4R3>`eIsCd$RB@e(+#blGO~jO8G.,Ue\T.1h N@qn5h {Z Db` 2ĘU.͎}PWC*w?mcs)n~|G%.%ٔ8IlEWh5hj"K+@C*;(Sx^cyEC*{ &-0]"J"6ڈZ , S ܢ鐿xje&F1CdL>q(˶7d&*9Kțb>ɪ<8h,hO1b:t`yZI#/%L>Vm8j ZX*))C }cMaoL 8.?IFr=ob= 
d$PyZʭU*%E7B#,1+E;'itQYQRR81@B*Z<7+bQr 26휆0qRdI oq˛5 /oo.۵p\yfeہi= r+m3<~V W9ԍ(ɳl02``>\KΏ\]3AM@eMBT M \@^b蕓b\Кdn$—"A #U)>&)tiR$Lf3sՆc$ch&?13>f_ߔlnsՇb<'Oݺ2W-+>&Y{x;XRG-I5ʬT̨XɐY X@0Qr g!YJ Pr?` s̚J&gEAgm*H4ŘeYeG>wNV,?͔|8ݡ4]nuo-vPN׏ԏo8x4@pQDF$2iN8ˀ!R"'2 SxƓiZ绣&g7=V}S5O-5[n;6q7{,1nخ0Zy;a7ϝN_Q1q,WN)l^x2iuFrd1GeQȘA1Sy` Ah\F5cէG?Jh^{$۔N ebB:&RW>]/& DU2C~LxrNVD> mE^xXk:Jm.ίIù';7m]3_j|;zƓܹ֡{/c/j|wSyU/י^}b<>d~.Izpy*vΏ>lLvюL3GvDv.Kù=Rò^wG_?XKGjB} * .!^߶[<v/qkwYKhjGz=h_W~nko0WG]I]zr95`?8<,O4{#j{?v%%dӁ?za֏i25FZX? Y Mt;5jAj*yIZ8pD%i8Jre])b1KDX;DgY0m扤YJ)zF%+ >*Aolb)#UV %/fMp#Y,)`cB*0WX}ӳc2=-MۭY 'ruBU׮t6|Ꮨ.ƗGcQ[$65ҡjdy ƅAHC%>J MitOd<̅<]&탩:wC@9;EOO"vT}1ރMNHF'BN ,80 }JBѻ6֘ǫO#ASYtdv)*B;H'0l]`8%%rg ٗ+H=ύT72D8aq+W i^huvyןzm|cľ`F̴"vB%hc9'}H, oCKƣu]ej7,vBlzvne >y˙CmxqRe3,#34q֓vLru#݃*{8!mr:BgKʉ#H.i@2Vjn+yw`Ǔx=[zc=C[$WkVZn|"ei7%86]ku1ju̙)ƒ.af6!cǭх3njUF%_+ߜZMK%RfڂS&P*AFFD,j!z,.Edƛtj@ٙjrMFs @2`''&f71@J#0X#뮇VyToQՑ::cՑm_M WÈ/fj Wpz vpesizz^vK$}hyz:?.GӇ:A.;pϑR\h,5Ձܗw7PcoM,xzz6׿ht(g_2B;[iIʑ #&CR6=H+mTlsRrrk?[p0]*h:H; vs5b iHlb c<%r֚ ^` (u@ -.yu20 EBĬi<8)bDbֱ/lm|*u- ;T[HM6+-'t) 4*&ܐ<dST rC.gs2/ٺ|.tm^-ůL%\!LaeA $޺K!{ѹcoPb<_ܙ\i]@d^eR.ɰ\&[1:~4g2Z? ju.Y[U]m|V=))'&E#aH:6[5J{\$˱DMs؂N̐}ʥrWE,4w;Q)חa$,4Sx7G[:6i2i'dzmϳѬo͞=8]-ߖ4?/{(xVNO:7-P]`UP&4+]|:o[[LNe1|5@˾K$,u(h'h~Fx$b:u#<=:ߺ/곉7D M M'l#oUg/ֿt_rW= u#i"(^[ԙ:8Zm1M/dxSV?,5cҔDhU&ʥL O/KS<ӛZw)q#a3꧊=ye=:ӊ/f8ki˲??ҭisL4Рeq55+}/| 7.Eq%m/LLV@O3?EMNgliýοp}!&o% o,a>y/5x$uOl>¿;UxxOo*89PeIm%*%]0)$B%.-eoJ=מIՐ@;g8_-S!RRrLd(-.=A{=3MmR/~n[B{u訟O--<@ JKQ|y0riG 2m{Rk ƳMhshU. 
I[g/ P" xj#Kl@Ax (WNZmDSNgyIENЈ#.F]kV$KƓ68"UJ Њh]O݂PS>K\)X)TjbJ\T4TlAmuE |y~>YTSdb"&UlP6]" @DkVc{cO?vkoxudИQ:5yz)͓w$/ 0B;T^K*\I=G}~ϤyBϢΆWHRUkKhDhbVZ*pKZTTs @4dQ!ia&%y-y\,n԰syYs{M;̼}nQ"<`~:啈x\|-@;.[+Ct;P; v2qDCߣ&y *]g|Ӭp4Fzjrq罥 FCp9[ (Qb n˫ OתT["YĖ/cw\[inc\ן䛊Lҏ'-|yϳɅ&/2'4Ҹ8L~mhc|zcӫ_>N6ZoM~cᔺ/trqnȁՙ[6Ko|i0]KZ-HَwW"-6K鲅܃떏fi& a^\n"'e](v;_FE燛 QL`Wۑ"z[GnPrj[Mps}s&7Ts֘) y+QR&H|te&cB)J /h}NmpHmך7qN3w*1 .Ũ3*d3)"ڨ)VƘ]%vXtKc&OEV\8mɢɾ4(^5:S8˞H~ $O5X pW|ZYL- TaT<¿A_ww=IZ>}3}ۜJXbpWzf\G:5d+/OHquĊ|z5yL$)#Sd;%%Qɒʁ"d*ѹ j'o'+PuNwe;(\UŨǠ5MPUIpTMyh`ݹȘx~J >ߎ@\r?GuG4z?f^'mN m!)UsԞ `$6xkc8+,''B2f`&f憎*$.LI TU m-j"$χdG֝J6q$翂\5g{T$\ŕ*u&Q& @JKg}AKiMa<=/$yYGLaAg(vytZ #SɄ2Y#r=Y]x'-_o`8_f?^gY<|sf?3 ee-lv͡b ֩\܇Aܓ${sO~a!*g8Br JojZq$A|X5d('uj9nҝSukRulng7 PVy|D՘$'aȎGvkm]{ODž v Ϸݞ]t}ͻ<nIKװ#W Urq[Tss&l i]猠'm=;Fwsfi{6SoQZO:Ueg8ogٿK"ND::u%ggi+݀3kiGӎbKʦb(bt)L@ѨCV&26>n>c*ow;>.A6h TT#SLd YT;@2H* P*W› [@1ײ2(cFbhp :k t7v?`ZwPU6+yzy;z6lm9HSTޡĒKc_;@gE::o5T"JFqJ-NOI j \* CO.:L)'W H$F* Rtvh݌4f/X&,<(zsqY@vOxq?//pv0vb 2P Q$a 0h!qm9(QRBP*l #E'[nZ%"CZqk5; s4ZiǾ Q&Ԟ6guYGR&xF-}7AC ggb%RۚlXbpZ"{ yf2GȖ&eEđ1?jIDv%jͦ8N^EƱ b31"℈"n t$ AA'e(ӒA)gH JI!EF+!YTLH!:&iȞB d%/1"6Ee~ԁqq\^g3-uc\.NMV5Ţ.zuWOJgXǓ5>PN>6Ч()1%'\<.iǾxha û_k19 s7E?L#}~)9Ql"uEu`COshIin0;,FF#wQ;aQ;.J䚂I1AپI֪m/|b6S>J)2(Щ]"f*!$9$Э8^|Ksqzl769yx{Nr)J>74wdJi^vz$1_}!O>/>=iVlݿPb/iӅ缿?|OfigE?3#gV2 0f&s}Me}uG !FM-Ǯ+cæ*utoo lNyPվrqT~8gT+Z:4LNCvڸ*djø7=у oĸ/);]{.IGtyf`GI I|5PǖO6-z(5݃{ @NbJT˵ ,6ҒdSɺS[-h mBj.ʈʢ. HRz@žFLtI扊MgEBO4 =7qrwAZ>_lmg8#^qyA5}1oߟ>kz6ɠM֩Vn 6!Qb@RʄAk5a& g#iT(<+RHiDS (a#IhOw6s< )|:a~X2v(37[=,xZhzs(j?UG0 Eݚ! 
f>Fbx1??#!AV>rnw"ȷe 5]WzGf6o~tLˁW'ZSz.oZ/AÍ>0TF3JޗٟWGwR^vThRݏ {.IݼiX/ٵ߬ݟty?/C1K8#|mBuYe`eiNÔrcLx kGf#}nnFWQ5<}xJ+r7hv`;cnƻX#ϗ+5Y7Q }ڱFgNMnD}ig}۬xA˴۵D*'R.X]N2 Di }4%=4V_,~HyC^5 gܭo!nPȣ|$!% t@AGHt* ^Xv|qy>bt㵷i GU>GVx4yx2ʛ^o+v't;٦KUB  A֝$6OS.ara`t`D`$DN 1i~DΛ(]NI1BrIIxL'w{zY{1>*1XUeub(klY[S"k`6K.Ɂm b]'ڽud{&~EMMDjo?u2/"UWSuUD}jOOg_Ǔ6T)ТV֝h;Nh}<yUѹW쎓*K ?jﺩ%BSK-a$yT i2@!JYKiƀd,H俱ufyWU ~_~:X3oxG_.b8SNqۍ2W٥5eM ˇEJN/X+QBIl`U)98T~.VENrGXOvv䪱l<,,2աVwhO)JZf673 ɊCBT^l`bTE;3&d䡱5Ύv߿3ۤPlQZ li}x yI$6UH+P1!xKEkԖ\ԓ 4_z~a쐉RpNdEK1 b@ ֒ץqjŋ1oΰE5B^"w@/BTa%TBoQM8FYL'ʬ7ڷ̬$ 8ީ?d]4`eD+57 *+Sp%:ݎrz9k"{|"u-v)*]r. Bc8iGQ&Ak5m74+2XG3 :'} 5n^.h] Cd>u`JO$j& a 6y`46<dA:F[o/Jc<9! j@]0LǝΌj  fЌ^3I(OD@LL2eX m `|WB$Z&RRZ{v22 y᳐~oȶq1xp RXI^iSV\hLMg hRThKZ g]9ܳ[R7a>w "}<-€=;8NQ/t<HAN&ur޷ 䣔ڦ:w"RMB(f@ ;ދ:AVmxܺ7O&Z$q·rĺwkvﹾx}{ u)HTKULWaԘ 7:txܙ#uU\cLZmtup&ZW{LҨ{poY/__^vfT+}h#<ܧX(ѯz,x\,WP.z]ڽ {m}y8vyj۲E5Z!z$۳E:͢nES]Ӆ9Xd,I}vyu8)xU 0|z>,_-v [Y|U+r+iTN'V-6No?|! +8R 1Aqі6(0 #  LIRPN/Gq7\WBksוP4(3J]E ]AȦ1dh])-fK)tRtfsb u•o'nAuՏ`'+NڧAgt!+}G tuF+KڒtѕFW󏮔rF+ ҕ;SכRt+dt5B]P] 0+ŵ])-e+tƨ+ ҕr:jWZ^WJʻM,HW L<T\Kѕ2iGW;7<%{j!zBq>ty%o.e ԗ/͗ueή"jrqyhoJX04:9d^V -P7y8דּUM -:gYz{qys{VZSkso5%7v*1CJFD5oλ"O w懕;3GjN)yLM7&5% ΥJiKv[\rlDfylkВ+'<@A=9@gޓJԓaO=1XOѕFSc='M)FŒ&M #,FW])mu<|9NL0`_x)pvӸO@aQt=t-z6(HWx.FW] - G[gpճo8P1R\Rt1+[̤ GSRtdѕb+ &w])%IW# R8-FWkc)RZsוRnؓƣ+Xۂt.+ ])m>Jk&]QWw%u8+b+ɠRbt5B]N'e b1F7.rԷRlΑdxT XP ux|bcq낿_-JE xc׮iIZ_Ҩ)'nP\(sA1 술G \x1Uv%(t5B]E] 0k\?\ Ji=+ ^hǢ!O<lcηu EW(9-ĩhվEBxn (+0jmcW|4R\Rt+St5F]ar%u+uPuT?JIW#tT])pbt%ޘRtJ)L QAhL1R\(fJi1{])en Oz]y6 ])/GW]R8j &zyy ̀8" ./n0s?OOh4tܖ2`U{ڬO Ueic)Te9OW!B<.brP~@K/\o㩴OSOn=9\IOm9c)RZ8R1*JmJѕb)mu%Lz9r;;1@K/`7xtwă~>]\]IW=D~ޱCu+}#JhuMXKҕJq=+.w] t5B]qh\ARP7Rt%wqu {o,FW;yRh:OeƮƨ+CIѕSAѕb:B0J)s۹xճ*6\e pjGغGqcX6n/R7nx@K 8@AqGQ)qrqCoLqh+PΌ"ō] E]WJFEWZ ] pru<=9;?16l돯]u,T~sjQi"OΘ{d˖vg!Wk{8k^¼ݣC}n:Bݥk!H7iJ'f.¿߼K*w\'Dk36Ac+UԃA*P{yNkHHso^ݜ~U}{v#7yn|_sIiǥm6&BH-wUn ɡmWqH4.v~nm*n4 YF41uօTUʅ PLSJolp%Ù_Հ@i(iB(en#&,$K6uP%͉$CVZ 2c҆Mm+5Բ!m#USKaD jJ2:U)b~+d-cH}+r}݅TR9W7E:g@:ε +\((t1JRSPCh,YAjĘ}]TPMi9vu 
^_; R)Um%+k T Im.`гj EMalPUH%ɏ \խ*:](xIwⷆNZ(" Mѷ@N"%n'ir[~VZ>&)6ims -%ښ)H|IRCbMsZߴWM ҞXc]#iBb%ɚք-.uX1P/l5H5g@gbh ޅKSA'(i IA+M% HbcRb$3J[蠕B|7kI;e9wR'v{ z8j5`D)Q6N84鰐A=aP.nvߊq)tǬ䤡bLbb󢐴p8!bEw!v> ؾ1PWM^i $z] UA׮z|ou/c&>Hh"X # o b9PT8x0olJ?n:@CV+&{H\!iZTUFB1L: !'`Ge %tF\8gP4I"i Qk2!`(`G~OC |Y%H Vjx$ۑ ox.Q,TGW>/AXŸۆjbPb$Sa>F_ [q'K =`Ye> ]{ #BK|u)}=Q D Q2}CGP=n;H H jQ%PKcEX A%DEWHPl5CkHhmB[v#hXD+Y w]zʠAUbF =>A( =`-f=*-+P!>now`E^QH"NdviP'7 B`|ԛDyE aLO(ƈMnƢb$ӽw:Hc[qOҕȪTcitFͩm X{uVjҢ:jփ*M3| R5zfҼLFALJF]Ρ~]뜼<ܲNKރ v[j-z5Z!@6(-|VT@9 Z6i =W&Eυ4CS3AQ%#p=@{@ Dpqno/0?/b;n*b]bA3mjl{IO1_ym|ɾ!tts\̾Y78ӳ\*Dti͗RL^y}{ ++Ʃm붙L9qf6a|HLh"" |'FWź`g z3e!6qN /wzN@54; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@ MTSq P~='Cs;. @ d%`'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v:1 rQm%'q*Np"Q1: Tg'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v: / |5'5'4'J @^oN v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b' z޻ŎKjJW=No'&ԿVyZn` ImaMa:%z2%uK@"qޑl3, ̎k֛S+B;=J4?n)؟w( u?nQɳ 偞 #f/f_^!Du)3 g"bvlEaYg L %.LxbCXj3_*pӲիqXQ%W%t̯~yvS ^*jw_2➞=¢DVԼ_5\dEץojnSR w_/ (be~U+chWFt*_ ͧ4}0E!變Qٷ~ ^ըʃBJ!{?#6Voe!CJ*ɭNFr'ƀ,=Akb} a(c(oTpP~2DhC8J(sc%vJ8H1"]DW֊S+Byn>㡫bBtE ]Yj{tE(-'ЕO {|f\ҞX1AWc^F̔ +'CW֋S+Byj> ]o&DW8zBWS+B#+#{=<ب8"stVJGHWvBtVUj*tEhҕG=\+TZ{tE(o=fz-}tiM nSiwRy8MJEi#m{~@ r^CMoޯߺD;M4KRuXLLv!dZXEV߲ P)ȑW/W׽rvAL$EŚKQYC jS\ΡKK|Hdqk$GqnJg2Y?H+d>xY(Ua֏2*'DWt%E%̓:U":]ѮMf2]Mv}x"5^׺KWC>,4Boy^i1V2Л7pam#]-v߳]D<˫*7[셒?ܬӲ˫9񽻻~D̸?x~׳_grwb?osOdNćbn |@8Y=n={8?)HSk%46!>9|-yjNm4:z+9:l6~>ߞ-7`n]gCHmoɟb2Bۓz=&ۑ xbzKW;:H~yǏ^-7/>o;ax—_ݷ9 <ռq/6F;NH/b}~<6Pjֿ.ol~E1&RFHBM|nUIsdR)=w ҋ@HCwOqP}#$h5k3ړ]wQ}5^HZ3ڇҙ~FXޅ1g @KKA k^<&.SE/I5>Z{=*1Hov쭋RYО*Xbpj,*Yg-e#༟Sb}B Ϝlx6 _\Cɗgt {FwFaDxw}|yz/vy9m'''rUjL$he $-EI T|K`9.7*<r?J[FkS鮴vM,c6ynUmY)0$J"]lFEf0TOw$t8?3sL~co[tJKD-6#z(Te248V\77yoJcC1(:ve7Hz$NxwۧzU~i?8?ti||b.ϗyOuWs탳 9ws?+|6ŗ.Xd C @2E,H% ~òDzeecy0~tSE6`Տۚ4uQ + q7atR yri:jKΔ"ĽY2kƭ6ӚPڛqvTH.q@UssQk>qa[nnC@M1CnǬs>}(?6ڵ֐trCf]:ΞXwmEq:.+k~{!'~^8}?+kh+}37vקPe'^Zi]GsbB2? 
oaQ4TSs|}A*w$0ӷql4y6{_ E~vpCJA6.y{Y}1n79Z0t$;:Z8;O~+.uD1Cd| NVK)9'/=ۅ{VdfFi YX 74Xt~X[~gojTxÈaYz3NĴ + (͖LkJ pX*h߰ ZV3#x?̀4> eB\},!bv>$}Į4먔tPCQ.(T~o+7]|U~1a2kѩ k(V9R ,^ 2,2a:>ʋY] ܪ'a޲&*qXm9V{o9^)fSmtvγ=M9x;T_[` {3掓&Qlwݫ@vyi8^*n=;ys^^~>8Bbg6e%Lih,l!}+g=;S85S33f;g!¼ aE6lh"j]iЂ [/y p8+4M'wN{X Nߎָ[탑Er%g*cH B,mW6CbF~SV.ٮ\R[5H>!RN)]4žC q{/zͯt /#_%ԁR3X=By<5ˋDk:Y~Nݘ[I/*:}y?6RZ ]6d@U۬t,کR(zP^T$7C$ gǾ #0p TRřXA8Lv+r%%֋4TV/H~>Ҁ>_jZx^:J:ʣqB2Ij/PG!IZQZ[%˹!{0vvi[%68Qq MD:^H*SJJP*XPȶQ1\0l!ثuJkIW%V !XH2OE pyno߈yXͿ,I[^ӹok,#a7fq01_t`6l켼7 Jl:Z{l:JذV(~u041n \'O_{U`l!euk}^hʃoy}7oyۑX=>k!YU(A81V&%p@ƑasIE}a߫XfG-^Nfm6§O-Ejzo52CA.b%|EY8fzChGD)G 6HS0 =Ik/9LYJ}'6%^;xZ%-Yz.%9ߧ`@+Œ~wh'!-Ѻ<٥v@7( LhO'{ڃi 7([^f&("[eё?8HBmk6|,~-"Yp&xJڤ\G:`hL,l`/Cw<q]|}iK{錚%&ܭ򑚚H5Wg4jܮ*+ inU^ci4)c}}V4J8Y#jf޺ϱK+r+6UIOj6`꽒礳z Wel-'KC1D9 mpI9)^;s@m| :^fli%e1ژ"$52Jh+]`# j\6ہ A:o),AsRo: G6d3)T6fdAdl_"-zxkE/Y< ٻAC%G>+9Y²&WX9gΔGdU1w|ay"-X>msRΒr[s,&;&c dA MԽSc(fS,Udq>pի7aL N$NVPȊ$LLVSTD^V>J._'ףtRM᱊Xa08pdTz"(T$ExB0EI*znAmFVBZ+(TjjIȃK>{N%c:s2A+/ߊi*c6j} B#0}&}5&58Ej&4,dp:AIe"ot׏9D 63|ŽCJ>6]|>bhQ]}%x9YXG9`Ҵ_uɵTxnmӧڨ.@sLia u(uoc/x/v{DNeٛTp$={5L\:'G)G4 gEൣ%&!("PZk+7\Jyns]'h4Bڏ˗1f#qsTEע{^ʲ''rt8A9Ϸ[>_y4cEc. 
3KBE_9tMɪ'+ <Ǟ'hsfٺYrAd3=cr԰ Ҫ-y5&+/+|{7zw3^uX|fifl_lv[dݖwn' tA$ 6&vtQ+Lud/`Α`0n{ 9p, d7E1B s=VpM<^s  :lsm1iݢs@|fl[ أ 8dpV1┭Y RrLHْ׺ͧv5K!c.<ĢQBF"m -=GM QrEF&(BFă2$}TcR—D,cG{Qn-W_޵QP6?{Ƒl X~0@v/6Fp^GE"$%ǹ~gH2ٔ(Dv8VLuS觲֓0W$ay&RhXsxRA\S 3Zcc;Wk;$3m-A5D@kiҿLIgc+CyU `UÌp7g]Ə(_:i#b]G ȷz+&B9gIS8Bp ZhK\bR 8PPU?Bk[z5f+e첥3#E)л$.Y V4GPt}eyz^hI8D Z(oi8~,r%9f%RRG=Q(8e$*M2 ,(@ 1P =CC!ehu6m i![Jw'\iGN!9kARuX~BuL-$ϥhqd0=C(<4as^[t} qόk?4rӕ_;혰=lQ TC,ׇ綶cCz}]ykݣu.f|]M+Urd9Hrh.q٭>u6!emÍtOs81b4 qd#?wO> K#CC9y5~w76 TG l~=לep;='-;}=YsIS\Goyhuq{qqWA!4\kں:xyl18k Ý *}]^w~Ղ^7ʯ_uv3'J_fMnV]`|Tv_E.?^(Ld殫 pV\76IcV5ºEC#CM9|XӳO<=]z6?l~ٞAuFF@ )П3[T$m]_OHMrS>RlJ3.cNNHsCVᔕ@TPy1ⵑQN $9pw,E##e!*7UL9NpIJ xQLCH\D"ވR.LI\"jO8KNPJ7*&nL6qݎ3r5 wTwͼ~pOݰݛjdu>)ށi[ }_@S%4:As~h!(N]o;CoE~v7׾cyȎ@55lvmoƿ<q?LMAѲ+m" 7YXSK1RCmA 1}OzveZEZ||y -Q9rEIwCi GcBM܃ Qy "$"0Ӿj+E9"_?A* ܔzQb Cy;B6[6> w47>8}nfڮvk@}x e/- 3^<`\zs^RX{o;Fl+z"EOfR*DfnA*QDĀ5X-EaN掏NVmN`[EYnª vªzVqѨipԆB'T\+i)EgCJj )Es'R RI)CR %(\<,.3\eiٛ,V\}JS*,]%HRz3UnZ`ldw /_?h 4>whCTKn/~uΣ`G6N>IF?cD7eB]!O6FOFGf=JG3ڻH/D B}Rl-J.rPÿK؛|sU񍮞-O/}B&*R#6kʼ0I}>|*B ʻO(w%7T=Ʈtt#ԔK+hekl-TRZ*[KekTRZ*[Kekl-5!Q´T<0= ,.%"ԟ:} 1BQd%@\KUrJ.Uɥ*T%R\K}(ו\:7\KUJ.Uɥ*TEZ呖!BPUɥ*R\ªJ.Uɥ*TT@*5+ߒv/?{GR)mt,kk򦱌T0hu+X:M"]ά\Q]{.R>3rO< 7DP嘰E(#Qqn1%pWE75P21C7D/P IDy%,wt5G1q[NҰmA'V_xVyi$  VYMFu|\~ɰ0mܞG7}tr*ܤͅ/&0ߠr. 
77opOooޏ]wLL >WRRf_Cݭycl7fho{\{ PNM&b>}ٽ7-{@r&b7>+7R#$b"Z?'!][H2џK (|P )iFwf 藈lu^_LeZfPvknrZB 6fuU %|@t6C kn>oOhvNM6xCj=&\&'SvewMF\?sſή-V&#\mZs[}:*]o\C/ݐ5 ̠lꔥg[ͷx7f!05D6%Ϩj/ l賩Cqϼj/KWe)U{a՞A\=rAg3iDo hU$6$\c%sM.3z^s7d| yH&2 Go1*T\y!BP,pg}F\Ҙd|y}FuR3=5= TNN0q"!A pN'<(c>1'@5"E* mS+ 2]]s+}ۭ$@ު}۪Ȗ"qȒF#Բ;)'LH@TlsŤNB<Ĉ6$_yZ0j:(&0;K6+yc<w%jj8[#6,fL:`M3ѻ`CQVJK!:c^ l֣48l:#82 YƾXc[Xxsg[ϋ|yr_]0闓cwL1`jc3(eSXe)5TT0ycbC"4 qk8)9p6!PHQqrW o62MqNʍs6򥣬c_Ԇ j/ !WsVYG[J!"7d rPG)Ҩ)cձHrޢF uffXSUsP/HwbՕD<~#Ӂq.8>ED ₈Y|2LnM/&զ&E_mEf(EƠhc%8Zc#Xn,B:Τrjd$gY̽`D64>T&rN{ìd_\q/C߷)4BAճ6BsM))x\1ЬUT_p8<faa?<< FysʶO~( nI~dm ii|R DG&NN>)Y)"dsaw@zsJjن<ͷ,,dqf+p{uJ&YUGv|ۂV/>|@Z[f )pOhjDd$ !RT iQtH}nv>t[zI:HXkW#T b;K%yF`&,) cO+3:B߰61* 1Sr88AW%BXL҇MgǢ$y:;?բ4'ƾg6?N]..7=L~=>Rs+qntWsν篗T*6KorWnd*ʹHj) Aڌ`XAa)Ǫ\1 SRP bJE:IAJ\Jw6sJ6;GuW2a}G[a^Xu- ^=`,(BqkaD`UL.GjYlR"4 e![gAxJX4rk^H_Z5c xA!nY 8zex^#q<:)dzԮڄdJ -f+rhC 7y׍8mjgw#C'CMIhtdwF?|IK#@;o7֢9*bt2\ĪA"Vd,u*ƟRX/c%Qdu=M1:Sԫ*E9V\@1F~V^tvZ.~B0Y`+bl5SԇicOe h]|yzgaDݿR˻RD xhˈˈO3j6j:Q맕d󖉺uɖÕ>.ӎ+@]I{=u=UGm–!8Gm\NuegaljZmz?h(z;>_zfy>ϯjxKJbK7tF֔3rz}_m5w*ҟ˻_f6c`榺&7a]0lJmsNeiN;{\ *),Ow3FsÁn&>ҳՃ[Tn ٮ/<-hV[;^הݜw=b_2uĞȟ; dq#G~ro|}rPxo@  ȷ T0r%◽![3_nΠ?̻OC5UߒdЄс+ܤX(:dO T` `F尀^~>(xcӹ(rokTWIІ}oin oL{>˫ǥ𔽏/'>NSbk|q#zrlNLJa"T?k$)@Zd#3i`Z`GZ$WJCePmljhV{OF %FKQ!gU niu7] V]3&r]#\MeVԣ3իTq$emT^:|;YݜyϚ|p*V{8nY^i5fxz%PIm^ԏҗDRBD2xUibfDTT{Pg3www7 ~!qO8"jK7?͕{Ÿwn}ܛAśb~Ŏ1% Zla.9Fx$2R-!8K,#~$y-J1T`?6lW}nzP, NWE+)9%%P@u@-K71ĮdIb t߿_8/5'J,aȭڸP,$)K2L$`j˓9[o. d=Blc31U,\J*- CcVlbB`BEg;qkR0ʺ( I,RjdDɪVtjp-ǘa\m܎yzJ~sqa&gcU]T䝲WrY!Ljx(h8FoJf4m񮅜wV@&cS>=s.1B/šh-@zɴZorǥ[SPVD'ldMp&<OΩ7‘gѳQW3(1RCsIv x]GM&q]18XnPV@fD6\{S\Jb(쓥RSa=;vH3ک< = ە/M./H~^|lҀofqaGĎ gG 1ȉ>鼝_FTBlLPJ"6LMTJ/Bha@I;ދ`^M;,n uZc*i`5XsfLʍ/zQPDi(; ,W%nhN! 
Lr17韮a6١"t zbs/+okrm75+-X(+QLo (.1kpw^_]˿@?]6!l[;a.vba mt!yϛs̍~D?x5B)aj&KQ<* 5k@!CxZΥJ,>Hmasno'2W$6k̟^Ng5 (c`"Fjɴc 4`*pه0Hi@fdCekX+,QjSS̉%rNղ92l:{<;˶-7_?t#!b,_+]G6ုui%]G6HuidW~B^ omU#$kDPTM' S}@~B6n H -RH[܂d$Vk j4BVC_ag'-%;>B>:Odn!EΘ4 PLS!)8>btvܧF9VRіֈY.s 웛 sPƌ)R2sRg !CBD-I eI+ (GO󆁙 ~iӧ3B")1<,₉:L^RJ" ʣԧ$"ȸԒS&O)[dgŭc !v*-eDwc^AzSTi˪6Dˋia ,OFL7]~nw=`2 xou P`&~ cqnkr|DɈ!ϛq:} Sc,N|j#߰v$E)ޣeр 6_q0N!Far&h< x."T@؅czR|j~,gzJ~]Uw]I4{@}7(&a\z҆)L@m\ jbp"N3 "*Qts0ΙVS:ҥv ls4o".KbjnU£sr-,d?|gi8x?r|Ez_,=œAUSN4(H l"U /}4`<*ԸԷz ^(~ G П_Wn5Vu\vϫ/x/K|3>+A^si`^u< ͽIMs[p^2 J{a'Ӊ؂ond) 3s"JJ|%dz%$6Z(*6&yP+L6eҟ0< 7vw{0p\ +chUyVY`67`hf0m'‡5S{˜u)hLpɵu0@4.Zm|b:6:AI X&ȫ`]TN[x 91q"i%H"Σ9_}3j</+w pϔL)!UKˁS(add9M"H0ҭl`$e{+㻑 gF P Kg4I IG+xHH%L棲9^AH[ u#9 #-KoKPyLv>WO*֟ÉO΋aIM١WNק̆Fx!fo;_vxo/ZZ!b!Ae< ,j1LL<a U[=T˝|+y{pE،> *nӰuQIE(ލP^}n>/.;yq+"K6䛧ݸ8\7wʭ7Y'ݹy0𬿚w0I }y2[8,.kX+nl}J6[BPO*EB,k~cԊTjz50݅#RWH!G\C䱨+ VJi[u WjzD fBrѠ+CWW@%'-zJ(,9"uk.F]!BBj+RV]Bu%51TBw]!ύwWWHzJT/uڣQW@jG*y^ҔѥR8s9KuZ=r9+-EB(h3~5t2QٓGx_~޽)F93G^2-qccrw0o.B]ObEяbFF]c ?d8v 2Mu |`. 7xg|Tp)jM5#^kEy%wb!˗XKS}tx09ID±7AUpU`QfBvyczޜP!9O'wL:yૺǻ+W< s~(B:YNOφpZ|Vu&㧓mrH$f=2< U 86G<>ZV miueZ)ZSFO1>ŴSִODdr^Eؤ 5hiY4QzN-!a[rV~^^it' O`DHB%&G8Ҋ1-1J-鯶,j{7k5Wl+~4<2u.D.YVKHiVp%OeAڔ`i!S0"x T+obYDȐ`[&B!YfNx&F9­.x1Y WO9#T)bqDCIy)%aFSB22?1rŲG= (nG7"\&ҦOuxmMQA.3(>V4gɛanǓQ/McY{( e<A`&~ cqnk'rܝiT# B7t:rbꩍ|ÒǣbUmBzhtsN_q0N!VDY:rJ#:"T@؅czR|j~Z 2BCCu@uEzEA1 ^a ?0c V-'>` Gj#M0ΙVS:ҥv ls4o".KĂ@IȅGMkr-,d?|gi8it2ޏ\1j5!vǮ*+v2zrdP!opՔ|? 
45:RF#j% k52'X<4Ը4uX*y 0<5j\%\Zqٝ?ꦿm ,q]j |+>8{E*UL[W6o>y'7Wޤ&RyI%A % С{ h6~b 8x-xV ܔ}7zݖfTRω&;o78Ok̅HVm{q6Wir=ޑ}V7c ͟y/uO7['r>z4.&=oxthkկUj?%@5L{ovɹ ǥN2JK ti'emޢϛa (@k<Q rLa!KAfbf%yNpɵu0@4.Zm| $%huQ9m^X Ђ#4"]퍜y5 m}N) {=qi9p 9]0T,I F_5eHVw#)a%3Yv2= :uYKG+xH봳lB2DIm|&>)Wm0&GħH 3:[4jT+GTn-W_w#T;EОX J5rRhX3)*%U /E~/!k?v 5vrJCSKLuvW߃ 0(K/$&8˻mICKrFryf5~ | ֊ T(,3sTnYIDh;Ndګ+v[(%KuUYd% 5wki lϮu8u_ fb[?ң/+qepMYz?s3ojvҤ-^gBi%\ g~_m=kzl ArE=Ciq:'ae""[zm  A0ih%kJpj*)\Mj⏵u>f6A6=}@1Қ I (ȣR FtṬP%kSOɷ\ʸilƃXky L׆7iXL:(o`OqSw~uy7pMvҾ*z-ǭ,ؐoR\9UޒXqۗ$u**oڷ=ǕЀ]JԒm}Ë(YCR(Qke6zf_8uˣ48(}4.]s__?$5 hZ.[\zl=;|+A=ʿ%_V,Ū,EOMO|Ū%g'YTQi+dq;vO%)tfz3j{z(xΡN҈ ca0ƞӍ7Rgn gn ҍIU6[&4e`JT"*镳}^1M+}XM _$KNeN>I)2*|$ )ʉxD&&E3uoR9MY޻BOy~Tm]3,Q oLi-}A^Nx RԔ-VHj~9tQ-!W|ǓtAݑ `Zg|Qd!DF=P6*OJ2t(JNc,2< biWc\RH"AgL6bUDgeFo\FD Y'1) LJ<|9^R,4 y3ϔIg),B6V#:C&~5iƫ9WɐFqby U^?ɨB6C,%d_rESѦ,t4¿&tw}RA+I53s#Abkyf\{9ZK|?c\45:'*z")8qw9꼶;z:;ka H6Kt9DF}SLY0z M1X$"D-OW^zT=A"Qj̔@2u^ُi'guKbKmRhdۏ0v9_M_>nM, 74߼R|k&5?IZl! 3>+y"A v|U' %])ۼoHo͆ 8!sR&> d-*-EG@ecg8ۢt:?jʭ~+!-SO{/ڠAĄfZZ+cWTV[]rN GcG]FFX5ªjY?7Z(2}c8j`*!+A*gm_5siʗ/ubZ2Ny~>tDy%RH&9#X:Qq<31.?G>3AB1z WK  PS)q456琳E9ᢓ>g*R DRAmPHبZG7S@PsZڢ*5O @d~tNDI%ʮ9ǓIl;G v]_z'ivůzTxFѫ{z>Kc@7 fl_ |5{Xb`*a|31 ˾G3P+Ͳ} r ֆQ7n {H2_1Yk+QQ H>_w[Ecdè*3eܳ8ܧ{.vi!|D[ T\O:6Ws|O'@kF%P "SN->lUJŢ.l>EB6*W›( SerPIU9&j fü?^<L~l*(6l*~*KٛMp!& >al!D:o5T"" IV8rSJhRɩr3J`3i H1vR=c3u{vX/lfB|T@1eot;] ?e.L/˟mՁs6`,`L6hc@/tC\@J@ Tu`l2Ȟd vWݒrCZJ;ήi+<>Vq׆^F="].%(%={- Ԕ&P>=Q/(Am)Ry`ِ9BfdȚɜoEQɨ\2 GFlChA^:sX="q;ϨQ:  k[9@Q2@iaN wRcc/J,RLf FRHZ`%&Skf1DsZlf%E//~qץzkE] ꪞΰ5>#L>cS//?lfC \Xj7 ~YYٻdIF1l}Z| 鱐N2,Pd  ѥdƤǟ5Q]ң|zuZ*S%[g>H'+f4bX(j|2F v5Eр}eBeQCa)RydH!~(7s`_ˣ?&5M k8ѕT Y%it ubM[eN.IVV|;l噺-~A;%]Y/Ϳ.眾Ke9~p3(ᢄt;9,E>w=ےbc&.DD{(HuCR͝T/_{ EՊj9Loojq>!}x|V>p}P?DZyȡG3$WY6cm4>ծZ]dY?. 
zw_yޥxKww"%eG=-:er] r^\3H@{ -^pQn/Q}.\/w<Xw:91d>dǡ=ulr<-Dm,/..'(/eJ-Ţf^n~~-;ЬnX}]FgI<a1WmxV>9!xcoo?|YqwGMjA+[=oo$8=7-bă8CA&@Yr\s6"8V^dƐ})[eUDݧɃUOC_,8xbVk5{޸ߺI3'ӵ'WU] HKMvJgFy r~stN!CJ@/$I^(#:fbf[OCQɌNӫʇno?ݜ Ae+xmq7̉ PS#TDaXj{YdmsE8̻8^pwhd'ڑ4M1zb'8JtR;2'Qqw"XMX'3)9AX~3AV}zӅ_Z`U,0ZUd R(klYNy˘Pِ\4 V K5'P9 #kC^gnfǩ1v}/թ^O!_ͮ? "@ơhѡmuɜOczH;%CU}V㸲{@帮dVV(Um2pkj+Tmpb$瑏759!Cv2adX7&}|uXÓ.fM)7nO?8j}Qx]ӫ8>`Ξ4-\npQ~ݷiѮn(uW4/d!kD| DoBhrDNiHU8tD{ ?o-oք|æFzRݹsjh qCV_ۻ.4fToKLzyҢQz" ub%*QH+|-:zB9 S;=f/ 4yʤɢG kkeցˤ6AeTU]LYi GnP{45 +L2P!Zd61j`p:γf^*끁1>\T@ Z^ -)Gt@+P@xT\2"֞R9M!/e/Oyh30+5P*6Pq Wgcbk0.8Q"xn0>`DaJF"rڧ58!Q6& i ^o' T `L4#եNV{H2\ $Lmv=ee 6Do# t)^"H Lrs$t#;Q;ɪ`a ^J1m8=b ev 6B`7Aia 2aߗ7KuNѧXnkD[9$Lq񍓀MVBJi<]kJ)^xNJ:cJD&KZ \q##}2GrA;ܲ/8ItQygBHphV Pеs; 5hW )%bv2y`+c;{}$im j*+=U? Nwg!h!vSe~ Ni >b5Osǽ̪O6 /];h`8=# :j:$ J9?xEj%ʔk[uUZMi܎-vcӺQ#Ԓ4QQB"FsPГ6eu TT\H K^">JFBA=ƘP 5I)Cg /D&2)3U>#ξva_IbIb*q-1˜\7pw>_'_W@~.Qr^bxiRˍP^j'44|Sϭϭ /?pCS͔E%BFV9es418U)3 K)BjMc[sno'tWtO9 ϗ9DiNkh(zV{^L4pTƣrU :a=G;\Y븋D+3>r_::2(p4 3^Q.u֣`9ߦh,ɲ܏e蟚e aI{jK#B =QB 8=:GI]`|**Jh %2>!I&]ߒ{q?mP.P'7-5(E I:L`䉄şWo.fWK+z$7VN[ɺg+Զf%UuU]2ibzǰt%Gl,W\,֎ޒ#O!Zr 2#J̆dٔHhAJ9ˡ+ޱ1=?]U{>U/ AW|m ƟyPJȆ\ t##ҲUf?x]4!8ˬXťӰ/G)R4զ翬7^m?߽dU1mKW1Yh}ܫ;uBs(ḙ 7'"nNhdBiqf tFtȆ\s+B˹;]%z] ]!p&#"u(Jp} `0vJ(a|$ ]`%e6tEpF]%֌RFTDWCW 9+(U2Jh;]%4 St)g'OP |cPvlySK/yFѰDsGˢsivN㼠|8J'MȳV[1Vr/>3O%?Xf.T-'Fzt _oSր=qзWZ3lr.mA.8)~[q~Sq*6^HvVɓJjX5a&!ջXί[[LU7@oΎ7ͽcvBgxv6{-3׫. X$-򠆶-ҫ tֽ)7<^Pua]eu'ԯS9MGB#>W]yCZ]ivUU+k؀X-$wG; in%6.>YMo->z8ZY~fëZ"w0Yʈ$2\]$rk~O?iJ)ŝ*>%Y=PCom"Im; vV߭r?/Ȉ>n6D+wKW\t^O}}=Py.8I&[A{C6j;d+f06~Zu@jUlUk,"LE/P*'~/z>PRe}RVrݸ>jmyB9퓿 ]h6 x{]C {YIAW0ն]-eDW 2Ukt.tEhUB 0ҕ̈|*w~h a]"]UBDWx>` 6Jh;]%Ltut e.tEhžlpJ1LdDW32\V;J&:@L*i={I~pm6 |*>!G&G|>!üQy:c`kx.t 3>vJ(*19u;5D6n2WH.#B RcՓ+EyE.h\b-! 
GWG}sG8nX~qs$Qde*v`q3 Z 2WZK.9dԍaT#giPRT,1W\)S1WZRb+j s%=E/_\,asF6WaԚ/DRv%̕ꡢFV +$X*B*%\=Cs 1&d8䋷#W$,nJJM6W\"HB OPXʒ+L*R3T2hQmI\~ѣ+S34WIR2W`3h5<s*=Tzk0fOڇ g1o*ǫZB$ȯ REMҁ ,:z`/"9\BOA#0*dk 2#/۳BL^  8r@Ȟ 2%Q'7J'xB\XHU$$ie|OQ[B/ޏmÀs8c[@iӎ 1'oݒRzy5F$_;m*TA;&b3+aݴx^ԓ ]ƪ}_8`+YDGnqX^So * ~WUVdC/Vߨ> ݖhosģ^ʓ|Gr;f剥t2VW]tؽ:rbA+mm-n] }k_y+X:8G^/8fǯ#37nq~y۬|!l3?]ѧ~8oi uO)b~~-+-&_@y{p~F^T:}?<L-. 7I59?\t;@JcS]fz~BO'c|.ߓ"LJw%7WͿw.F^^ei$+`AUuR U[~{qcv ΅Kμ>'l^\S?P1">LxRֻyFB쩍|N3lr#=nK!pt{d2 ˳]boc8ŴMJ1m̀XJu-6G@,%Ma Hɂr i4l5Z]j9R|:oqF9jCoҨ#HI*}/Uj9CZߋp^ԵwLo^҆nHO|52t;M)QTABg9#RP2pڏϮG#TE#?N.8>⠜v$=);`;g#[|'-B r=c.8p%Bc.y\30a Trȫ>YF*iWu1mޟOݑOUr|9>d94| #G9_5.s" |/q͐ju!cl[_[wT2#s$$ˑM[5PCi\3$' Bmsœ~lFcZRl||5Ƈ)e bII YrWT P*c1<{8BYw3帨.B`zqk1Fn^!G Qƨ;d< i&0K.5i ?>Ry L ?fBzAjzO:G@Y|@fWpHPoV*roQH{s)|p'_ӵ/E< ?;Q{cNWRuo[s+]5 mUt~?LƳ}g.pf@aB{2-R/5 0rQiW/] ѵ-Zwͫr^7#N[棒rĹl4 ͈qк,+CL5iG#6W*~VozM\SQ#(; Q nG5VRfTL-׻{[LZLPlJ0pN(\3JL! V|(ciĘJ= qu׬Kdb߽)&!*+%dM`%DYȩ6ȸ2i\S{GO!9q44heEnjfcxEF * 94&YcQ3;E)Ht>p4ǍYB2wOQC 橛% ;&;)VA GR!+!+Gބ0kryziuMGeU}TTӳ+'sO6 >=,|7,%G{+ H! 
gN^Qড5kQB!8Z ԻKYrn\ϳhZNgYmm#ڊ?IP_rWu:r+x>;==ٔeEE+ -%nQZ/ }o~F|~b 㔀M7YQ8ljA73muff9ؾnY `v<5 *O,/FXԁَ2pHѭ nkƼZz/[}kbkX>`ayVuwlí)m{KLh V^&{`q`pjm-iSorx5QX79-ӖGR+P7ϫ{~{8C M̈́ Y쌤rg?P׳J0+x;%l1gqdЩ#"N̐d٤llXr2Wd͒Mz֎IrjTܘճ< e f KǞ^3F8k.Dc J“) cTjF:0;ai^9͇D$olQ8Rw!ݺP&rw؟/9¸g 1cɳ"\t% Sa|t<>,Y1 ' P3($Yy&J{VL 5=aVi z5f,zO )Ӊ->AL+V0)h8Ajrn%$b'ɪl\_]m\s2_Ve"@ Q^h.EBv)Aaj) RYIMFKY~%|yI ?$5LOnwϳ2^rB<+ I&p\#1u@s{\)nuO_a|-Ymoyy ENbډ+&e0˺&Yr`HV#|ԴVuY(Y޶\kZW#֔vb2Zح);j^á9l6tNCZ=pv?t2_hM}35vxzP/L9a}zC N2ɖW 9<2sl/Aˎ_a X0-|C qNg&t5FC-ܟ!Oq9_[w&9жcV߼=8?2< &:}?b @5ȝ'r̘f | :R\,H*TE6Ψ ן^}1BaG0g߶ [T PPLS!Y'S (su]LM_JC:40+k E s2Z2._dȥ1)|S|O *{WCc|;-iʻNP=1-RAXA|svK#HRNq 1F@ Xr03K8d:iuV&@ad$"ēqet|'a]8U'I$d38̪d8 (O0Zi"hڋhOM*o|qD5' iLU#8A-tRƀԥ HT:MeNI,<Ę$>R{!j 9֑I%j$dN:")"Y M#n&\KV ֯A8-ԳB.FGg ] hNB9H#1Di 3%⨄qBnv}tG, $gq]WUIxEcq =*xt_@AkN\$f rl6K9]8"YAs3$:J⤊/!Eh251RI9!Hٻsڴ㦚O\+Y.o^~ye֎|{GX^|0/0N&*|+gOH*,(CXxT-l3XHS5so?{8DfhD̡c{{؍9Ą-[oe%Q2)JrXd&DD+[+t^v6N{J(I7!.Ѩ_1[jd@0asN'I^HN?Ar ֐q:pCZ# 1^^n$ۻw$`si˜p#Q!S:F$)̧ ܠq/GFWK,OU0KE|qW-\FyXjH!\ntY4G׫ {%8"ΛI ?p4qp*ҞTl1P("};AhlO2//ؽVAAЦ)f:JeGM06wcH?09yNJ!^@0oUv?.֫w;T? &:GRcoυз$P)֣QFHg "dZx 0$.0PŖ ~ 8\IV2eM_ij%HQz0E#> p^w8*F毋=#-9tO-`Hֵ*<2zlnhhh|t5*)8ٓkTv F2#.j\i8L +tl8끿8/3u}P;KyO$@{̺\lY=@׳md>J VySd1vRQWD]DmOG8?7:~0]r~EqT&NpѓDF>DXLR3ba_L) FS8]1vo=bXB@F O_i|\4O[9vFMS'\Rƾsy 25Ŭ'rgΉ> D0l7 , t!3Imk_+eR$y}-ὂ~S0Â'vKr4*uqk̚LCS_s(~˒/rbKn{'rB~5d]+dmIN$/^0$C 1TgFw~'z؞׆` "r>8(B;a8 Ilh*Rq'yZKBwQiy?.:Vzh4?CBsd2gp=0'FޝV_>F5>q=`@+ H6v. 8QhͥME@isM\&(;5ei%v;|@1%VߓJt|" Ú~r] 0d y]ӂsZ:9O%? {ZBۆ,&#"[W0s~ٜ#^װ(($oNJ;EcLsW#'s72]nPqr29k2#ŞwB `2(*3C~7?Ε`51/S%?G/"5FK7Pr mZ~6]O$a(] .LsNjo!0t?)NjtX_vCyڕ GK!] 
[binary data: gzip-compressed contents of var/home/core/zuul-output/logs/kubelet.log.gz — not human-readable]
Fܳ f_NGIIlw36Hpg^ϹN8S@ pɅ9M=@Us4I(j<(/(\A쇟'eݰ?rjA(_bd= zӽ)K6ht4Ͼ:[lSow ^r~=ෛ]22b^kǗqQٺ&ݯJJ|pn^.yWFw.+{粁`%%CTg {2*?ˣ l^ ]5@3 0d H4"_NAcw49}3R y>fpRqy*{w4a"fxwzL^u$P|B]$f:'Sc{ [ - 5miӀlpȊkO(6qc \ZIAy Xh"<̐LhVPyϺ߻CQ`M WiV`~!V;mNbӆXXe4(6XoP.`VYeUhD@ZU6ݺ 2ݺt%goT2{kPA;h1EaOY+H1ߢk뇾˗דJs[t˗kZ_DEnN7N>Lr~"+2IvkWG^%-i*][qq >^|vK,*u?^FA>p>Z5=%o [r8U(A^x],yT0^]AAw}`AVA}|΁ILW~h\pm0 [lEoF:Yx9V+lb0Xmƫ;(ͽFq]χ5uDULOxoSki_$Fv zE%UzKPROIB vmP[uvsg7=.QٮF%#5zZYE7-%³NFsFCo# 1S˯G<)o:4p)8>8;1w>o %Vb/TIYb]{Z]wYZsuaiU"S=Jib$+ӎA$>8~3o8l_\=ͥ(Ϳge\LǴF.OH=L9Sqh glz5LtQn)A'޴j5nԸ^"ψD;}4e6Iicl3;@V%bvyȳ 6QTsE)pa9p( „@Ddk :Lfii>j ̒-g_Z,,,ouk[ yh\hZkqb[ɵvI'&HC`HLX&HJpf cB%L '91*r9x=xl$82iCQ(w sf8Kwz 4Ձ@=)CpαGub@ [7=Xz$mn>ݘ"ɔ>cR3,IAjQIE4@Er#[t+FcPJ `SC[:j6(*Gt2",3kxVTܒU ,j^J+14F;ͭQ R̅"Z 5-nMז^ Ȼz)W((;lC*h,#<:! )Qg)Uj"W$m9%5GT뚖Ge30p_Lഈ( 7/"qw^syy$R.?'OכXr3)/tm6!-?K7!OsI[tg7޳S",nHI17=32n}kN[ D1Cz6@Un.鏁RZp0G164uαN+ZgL0,ֽMR2-hJrS-!ڱũ}`$+c5v X^ Q⚬[%p Lmfq :> 0,PjXOSd.^G''-iU|{z2^ۛ6^6x?DZme0ܑxS߅q7iG"'ye_H2(2 @q9.b[a,`¼@]ˇSfvLڵjH-(|-+TA- 䎾::;RS7]}i>87p=HJrwڧqb3H-1yAzO8DA:Kƴ|KiqȻhP+%uBG)wRpGUHa 5scdIϵg{NRl#c!I<K ~+ ;JXwU܉}5l%Fc‡(eLz';'BhDh_<8P]:b Fz1wG"( G%OS<(   (%E00#PQ+밊3uhhjn:qyt)2 OSJidtYLr uX:f?fa"mn6_re@}nR{seˇKZZJ!)}4HJ3 煤Ĥ[5yn Wڌ~DBKƓ%3^\ŔeR- 9͘>JOB@2mpW@eXVI,B#Cі,y0w^I$,d/"G)ؑꢏ+Hnf1"DdmRHQ=Ev4\3:>,B#)$WgqY2jPC}Vb?YOhLĈM6FS6C_D0Ju#[Z.'"γV`IPJ yz|𞌄@qN& 2C䒸ћL`}G[EDЭ TG;` 5z\ˍfth  W- 0\[b}Ym 3l3'UIڌ[|hI\ 0' CYTV/7l@b?Xki9SK(%mHf։^cX?ZzLkd$9rȧP;N4Nk3xS,7uSK|\D@Wd,-{644}J3SLh$KH'':zkl4i6 4$_VX[R4H/k"aEj'W" Hތ倍%ےglj SP~׃;' .4de|"Li(3zWۢfc`]Q#yR$dYM1Qtl75RH>-QfTےgr2 0f#%d=cad9!du,!du#Q[spCfϋ(@X!73/ƛTpFڳFV^縯grY#+2:5҄)65Qu' ǵ'5c Kj$@Y=Λ hNs$ ^_sZн B'fEdp|4h(Prt-SRPRи p䅛ב 9fQe,ϥuQ;0FE$$u͕Y.\Jo w[PtӮVnٍSSq}a;OpmhHU<`!2=H 2FjFouԩTpIb4d-`L}|ީ Q0Iׂ(R};OpYi[uAtjƤ}'Ѝ۷<ƐK'q[iɉb7Oؖ=h6Ij@\vG& 0JwôR:Oa 5҆a}FsɎf{[T¶Zk)9xĥqvyo%s? 
7fwjh|}3A=aMo׳F,Zc&5nV~~umXQ^"=K8_ےhfJz߫(rؑqm%SȎ^'oi7+ #nm1#:mhyZY0ݺ7.{˔T9½Eu5Tw~4d+0&1i=:65A !s(QQx&s [B0nO 3˴#e?lHνw"Fi#XU2 ,d1-O`9mb49J&GE"`T/Nì5%AЮPVmbXL⎜Q VA랫R Ըt QPpHU&[O.kEH]8rѠ =M4!V2tK?6)2  sP. MZ2jSt`Fԃ1xގ c^61SOjc$  0O V7┥ǴA4'[m'av䇸&Qn YIhOЖ9ȻioA:"+7eV{^ѷ):3'bڗ |9_nս6K{mD=Od*Ĩj0ܧRUG90m=hխhP#/ނtrEo:-xZ~o WJcwۓPp"ӟ~OPK'ƒ86JN^>kϒAIy6(ņ#ΰhEϓ۾HW!%}!͉Tk@j4~Z_Mm "_GkqT=~'ӓ㧎w߮ ï5p|3+jZt7FL˧/sBhVFxE{ Q)5߼͟N&;[/wF{l7e*|׀+Fĕ[>> kpl18Yǎ?VD@ 93~|jQU=XETal UtlÎ1\H G Wu~@å\|@?C%z:m؂}d۟opΌ؉x$a7Ak'[W@u6呬{ Ojǜ}RGb)VBvI=__# ܞf^[ɮK*0rI-w.gi'(seĵRK &R1a;,L@&8gUl/7wS}{mnX~$;S&8aQs; |"m2pc'b> eoy+ݶ3vn7 s!k~WbFIXNR3|5߷ p#ӎgu|n>ue0P+OkbǙvr`Gv,UR~ W{_KttJ0GWvJ7 6k1P 8+u:2XB|Wd/)6u5R[~4w¯АxZB#~a5\X)xrc){K yk /sE 1^:_N9x "r\#ɳ}1p};;7z[FsqJ7ewJ9f HWE?uaA ~>6l# _JX- gbb`µƇRK}]/>s1f.@ lUkZLyUS彁[sAc_ :!DMBy)SRx;[zK/?.֮oQۅ.Fnq?I&ɂ\ܓ4V_5GOŧѵob`Jݰ)h6p}X½~2/$";w?U8+Zѳ[s?8F ^Hdwd#X) GpĭQ/S>eGL|'2Q,y3E]YsIr+uSu['޵&ا`ԩFwg5q4Q QeVVfVf\h8#*'$( ɲ҉w[bD7>/h/E:Hg3ڦ [X4R!DdvG ; 8y.3Ct ҉ĸtPb)HppFN@@llTIߞ$v`.~Y?^``3']F-'}2y$yPJ9X^Lr+GrJ3 P2F& nVTxtuV+Xn!Śs4F4#2: ZҰAD@?0Ki7`F%P͎/xIJ,!GrVd:A8eZpLJ$`A%#$%!A4[$ߏ qM?頑ӗOZ˂m_"'\tk[O]D=bnX_\iv_.w)gj泡JϢv^EG~vk8~IMIQ׿cȻz7`b4\3 "/~~辻tΏ//%{}?U{_7si&!On !R>~{t/!t`L|c.P4R%-:a}wk 'R$w{~ԙdj?j,?,WI1^*VVpCA+9tP95C8!J ~e^ /"،,O v c5"Rw2=ZzptIkB38yMу &Q n9 FIQ: D'1p Wa=iϧ5UyoJHm  )Bpp 5څlJEmz޸yD3G4ؒD pfϤKäˢw-IxJt(06!ۓܗK<'`w%}eXr%Ȭ$)0%S…siMr#X.76ebW:k:k \B3Խ,<eVf mٚ7+0_O]5P;;M?-%ݪr]vhEB^J*&M Gf8V-s, ۖPV4mALcs>$qWAD\1I=3帋TẔƠmzypF cǫIn,3NPr ҋ;RwE[^z7)xMnQX2nJ=!X^ ڲ:ϟTK&e{dnrO8K`'vK (4x (Jme ^Ft<-\H 8"Ҙ 𾲚JXF`6m&2ܲ։m{sJOr?x>wt]zj^TЃ0&[Vݼ4j&JmgY\v,9p)i?E_=\^0m\|hh) Xy nZ(o9AWZE>ш[sJ ཥbs%Os#5B0P)5Vbs$Yne8zE AK1R q@``n1EZBT_Tͥ; = <<,Yyɉ?+46d+*"D!Ј5O4SUP#!E:H`yl/#!+yFpi*$#SL93IL-E+4RԵZe7/~;4\21Lm9t!!֑ B"Ɖ>TD'TAߔʂ7U?,smyI'0w0_4Yt4!*Ҏ؊pO(>_)^m$X1'ө?lř9QȨ%vYYzH\ ||fO֮?NfM--A9mAJN%*OGM>4~UF^7$I>!F-vOg+']*@+?;#xO'8+)i M:i' ib pe#mmw=,n{uɣMp6"|P\b"x^  n)tݰSTkMX -39MyΰD"/aH21'@AE2:-l429^Q6Iz0%HhO7F]P<Yf-&,q%Qbn=w_2#k,K5"$Q56JL%Ұl(FUKQ0cq–a\V}wpU(3sW[@8iƝ/9\4x5tڈ_7ʕDݸիg}j1*@Zj!X[3˖6CA֦6 
PdZG ~G؉<,؈,2P$˄V@YV%W}iSNDW˭-k>;k&JԄpB VnU?McjU`8 ~پ4[Ip=M|=rL_|q =aL]˚[I'CJ^.R".RXJNv9ج-3q !L0ح*o}؋($O.,ѺpbLS¤){YĶY>kzl~N=i;`ô3Mu+znn='ߤG%rx@KULp&PB{ln&!G V?xM؊(F*"r^cEX|e7"9#nvǻt\zbv]hK:pg_ÆᄎjgBgÂ/`'S{#=%>5ـ 2߭#+ U)vsSCCoǰ޷zot}3EHVh3X_'"w֤G); `/]0yoBגPu:e@M:yIdjoFc%8rgl3ޤsqĆ4!2.GRƕMq`\?w2]M91zWV~O|Z0FRY}]~a @S_ Ť@Y=LEɼ]\k_l`Mp%cЧpjÉf<$>fy:XM[`ΩYIz֋`A+ ?01&rgRbED{CDŽAhkFx̩רqٹO&s-'fAv-EqzLЭ2&_'bU%KzY{RTz#995A˒ 62r31% id8ӗeVSnÚ~)u2|Kd-;91(cu]Xx"(Ba[6VߏVZj:XsAbTVIثF`QkMEZ]0LT.-%`|K*R`ـa<<T `$ 8.jK̜?6ڨXnwF*},{-+q@\k,Q!Rdz‰zbۻ_Q+6[;ŲfT{-z;p[-s4 ynYu[<@PN=8 2!|/ Oyr kϋH}:Y@,/Å2hr$0BeYd6Fba0)&Cx$Okscb Ԟ,4=jN4;)xL`ٱ:QD|ռWWL[Yi㨙$Yr]ŘRccڣ|yކzQ2B󎱝w=yy2yqyyKzBђqS0wooޜb:@|? w Q"`}^x7^׏l@GѾ{x F@-r[Yy2S{wIy`̛GZ[ٵy3{|=ʦΥ$eiI&<&<&S!7uB^ 9IPos ];^GPdKnh%Ң]`'ѡnj~n zO݄^p\Gb~|K;aL˃uC,d뺡MYgQ*P1Ϛ6$8bpNUl9jH8G{\޻(UkKZE#^| ;,SvsŽ{SNqv!ҍ PZm)ϑF AVrΤ=/S ( xEnmx |AI%`Porg'K7L8&\"1zzDR&2wZxȕK2E02x#!UZa E)u3ʵ$n`1fU4Xz/.I҈GӣnRܫ=J'(QGV(K֦h[yj? +-ºٻ8n$W:☈-هve78xݲdMovF4,FD"dT-hMX;q$N@A s\#S24Xb@NR͋[ι3 Jeٌ}P 7%30KrgWKEuG/"r|!8 ~z_&[+*KbJJ6(̱tܺ/6cخ,0+.$ϲϑ%G4RhO]}&ݚx0 Gqy՗Os"z9 HTŐs %U4d|996shodE}6ٖ9YB6>{,qgRk޿`9RZr 1cڳɹwW#fStJmXŴ*T{i^r3M@z" `+Eht)K.@C j:|-Kqʲʉ 3J9\y!.s0'!aJsWN,Zv uk}-k+̙呏YOyS2nS7|M6%MUw/$ afC5DYԌӨ%F"DhbTE(E=v ZWHV̕O2밾GnLƷCMVOk&s~q\pa|FfWAaHJ{f@帋TzPe`$( X2A?G˜T5TfP&2@RnlIXBE)MP%28֪!$ū+Y x>"7="uH9ArFa"( W%&p!PEL$PY$bޫhq?HLT>wp1WhUN sLR( FaZ) Qh["J\X>>}zm% v`Oz̨M 7n 9^EYD= 0` H"woPPNYf-&^0RtSTGJ G8]3&pogS %=Z>BRu-53[Y i@Ϸ~cH8:ϔI:(E:n]P^ H KÉwp~=狩ϓ|bS$xyUZ|7/^!9.[T]\` pi9nFV+ uN; RŨ{BS RFE{ d)=Z /qA䇻>iN\rNIjTs.lS`l`w\YdxHG]vn gUس`ʚIX,Q{IbY %߇`,s`Qm^(pJO69k'A(rI:19,%Pρvl{#k0:+jl{K7n{Zd& \#voa tGXtΘ*hXF2!=s,Y1΢MMC#Z#.ūyϛMKkoa̱뫢:ĝmk݈dA3ilY/ۆ*6ps3.Hn^pU7IeX_)I>%&KCc}b|%Ub V3Oϋ3kϑo~Yt4Oxx9f}dUE]1m\WGXYz񜨟Yxyzy,o|byR{[Qrľq|6Kߎ4ٞW\_ R`PRs\%q4UWuQOS SGm*EH3C1ns IϚvj  N+(qt֢p*]fAQxq䤔ⷕR2T {,_RuV/h&>n|:5H2URE4$@`Rsng-tGuwq~VZ( q/AN@y&%a@"i$) !/6ploIk杘Yc73PhN/w oɇ0Z2ApO%h z VX*EsSL uzȧ } ?*A!EF`&kU`"zly% KI.1TcN HDJK m%WfGy@rF1E7z,_cI#6XBD6ɝ/ DF 1!\ԻƁ =ǚA@~n&Ӻ +Db OTu h]ZR| 
ep6KNc\+pF!֎m3Ao~S_na$?~!ɚy꾁K1e=gvX+t0_c" _(]\~>bxMN!~$cvlTwwAf<^`;e'dOf##`T &V`aE9;@OB6/$9 D;" 7j W5X㺫y;/Mș/L[piB.Ze9R;G,Jfb,,։gv".V|vZk ?F_)S3ѩhbX Zi¹ꫲU \k; gmQɥB;b?7XM+ԛ&}%ԩ>*T\Y$XЖP׾z(cF- 1_BXK[Q (=ܡ(MIH7T). U׳Z7 !  7L jLEC߰cZDb^uc>Şey}fq>t.@kND~Fk^ u(s3 CNCA~ 2a\r9NX M䀅(ڳ˪9.ވ#=`/6b^Y>9OLعژy@TSf!!VFmFуdthG)d8/9:=M18ʕH{#=у8Gto퓞 ʓO$OK|jBPoC;L4N;ڭo?eeL !d9NY9'AQ.,3)K9?TڮtRO~wo=eIVuޜjPwiTшm`kcB:,+$G:+$uU 5 N GS^Wj½WDa^Ͻ);u9[Ndbt{fst>zѼlGrtN8' \px`*Bkc˦ŏb#0Iz0O6\1nCn [bۻwoQo5qy3AbEAG:.&;).:gИFo(/yBG)wRpGUHa 5ʐ7jIϵgR^B"[r`+ ';/ ,۪[^d /ᩙRIIoUGߡs׹ze͂-"zyID0'TttFQ=+̈+4uHsS'mL uT{& ŜQAZuT GZD0Ѵv\PgTxt z[dU'D@tUh|>d% WW8xAc|ML̑f`fE+t`.o-9H>jKK+! @tV7.A gAntJF!$CTfxq0c`u PS\k0wQlk +򩓟mn4 )(ux% ٻ^7rW8e/F`b{ ad_v`ֶ6fI3[ӺG౥X_U,_L/BAQGl) J3jeHxN8h,Qwl'EA<^ǿ@5f0)P&&QJ:XRSX3j$Oz!̜QQN"b$¼ qIROJI43:,JcH; J{9=sZ1=-{jzq.L8wYq`aZNpЋj%t:鱳K-Rd lZ[[[,, Emht}_ZH._}y~q%@~z*Z|u$]&q8хa:޻KroH"JjlЀX+3X o.r/"++ּQ&hpph*vkWsxL hNA`J &P&\+am N|ANrbT5,ǰ 9ϐNjk] Erf2fQ.4Fg?lJ-hmS!u7pYNn U-%xL^'wn^2|0uo @b,%{"O< El߆iQuܖTBwG2G{.-v6dsp5{kYV-̇p0\6C>dd<-^ +'IpyF !N`re^:|egʿLB[vˌ]pM_kyRTכj짻\']"$pj׃J_NlKfExZ;jDkg"W^_xR.wYO gß@U8$fVc]qN]8Cq@xfėT35q`N)lrG :JI Qi U4UigiQ5$ 5@!\۫vr9ӽ[a5@Kͭy/cRZw/qUˡȤa-f')ܹu8Wg2 \7uU*?M`t} NT!q-Na>c쥓n=ݐٗZK&?"~Q+YmEYRlJ4(8nG Ւ:+>f-=x$x$ TGe~ԘZR creBA3" 5xnHL,?`HkB&eSޚ49얫}Gq`*Po[vGSg6,nY6ulcĹ= A侣vU-[7ѣmPFk۽Bf*Vo{D,bIa+ )ɒ PQkwyTb^c bO@S)-<[s1xh~J*rDӼy6[r/nG L_,_L%Gd!nf?ı;<-K3XUqݒN7a 4޲j ˏ*ч2v7j.gSÅ -d3zLByd#m3#)-<‰#sV/PGj'_6UoFu;- #Ȼ9Nzmw$ͦT2-0u5q冚Zމ-Xay!(0)@9 ې`fo 6 e'#b7w )Ŭm:-Q9R^pu%%eՙxJ\鳖Gf֮#ϝb Sn6n~9~H›vH KDX>]}GUV{}&FtH 1'ˈi@uJH9 PH1e4DEqY'X;x%Fb\ar"cHT3&xSЀjFS8&FQoK齓)J% s$\ nq$JXv%$Hkdz/|$4):Z۹cu3}Wp3,&ϳj{#eWk.n&AuG\|J&2RuŁb h =m_G:.kT,$lgkAc >{օ/ L}貢BDL tOv1.iFŹ6D DbYrmYŪ}ĭn}8 (y #eIR`B8xJ2OY&PBr45|4{k$=m$:=/_I7Ӟw<}Euu2t:K_I2t4n{}|1!m.) 
\Z5^|B:",ߩ3\.bXد;*thu3UmDC#b0ݺMP>3T‡EZW~}D#ߙ=8ck,SOgSU۳far};5ݏLygCE2 M Bt4]nnİ V4tnXݕ&m&l]tT +NuuNO${2*EE/')k "LiZ)j ъq+s$,a=!)p)ԁF e0̘j0Yrn!F-$!rM%Hy}ŪoIG j0+\.,ˎ*x>/i-87C|+)5=~i˿pz*$A9LgItoQ%r*#YLg?.iN8LphlSv77TVss`ׄcY5bk/f>MnsmFB|07&+DTN4AcB:2G'JsxcŃ @-7)X7Hx{L f&cqN;7tm<$&CICꟳ1Y)fo BKBu_~-sE[>^tqFe]xL=]xT 8@70Z4iUR_oz3Rrn( (- Pxpd U 4|SbVp_K,VYǬC 9*9"k5kw rJԤXQX ^yKAD 3k"0fVba6jQC%К>[YD~jRҖ{dLY ;00-K=8<|&q|%1]YǑ+y[[}( pZ)$K~uRYfЃ HaU_feef呓aE1ak[Kg5 ]Uw  V(_]e@wjYEc ۋkaJbr*-j'+<h5$*WeŜP;{0BBPZJBeր bNQUҋPS$xt:Z>>3S-P4ƣ{pC >FT PexjfBV3)!Qx{cK :~n!N~$;h;0I *g/RZH(ܘϦ޵蟢:ȯۙ`Ѳ6u]]l TμNupz 2,e0IE{bHi ub+XD~;l0R!"CW 7C$AiS }qg/'颸ãst\%窪T9*?]|9.88RXí!ɎUuT#1 ˛ps7",5pŀ̧GI6R3-$.>io]5SUc%(w_s2\zdgDMZl?\U04> yYmmyjs.J0t*O:\>!yzo>BX^b!Z%D^*WF I2:EqjVX/ ҙJmjF-HVbYgeblxV O)^8DO^N Xoyr6P0~5%^P~Ift/Op2n_I݀]/^%+t|HSͿê˜+DSuƃ1?,~|4ːֳ-MEN5S-]bBypć-OBWynf_'%:H;)U܀Biա'[ja֗Є:4VD&&ip:u>x4w/6`eM)bB:ϋuK[;'Nj*GRi)u0gn~mY賤xj}ytjⅪ7{Wï_U92EБp(356\ L$>4&5 EE J_|s 3:f2fsݝ!s2JTBpja馫y{D2HlRȔHLqu`lbƈOН!/cdy;z +l)uf˵pf2gk$6"__̷ +׽}_x:ufH֐ܙHf/\pU WսPcsIc|)3CY/wkF.ZwR3!Y*TǔH)}LD0RAIH9@)\{@JO,h'>#|ŒfIG@V ; Ke!dDPN[:ޚK]2}轝/\/8VQ{^HW!$ZQ<d}e Q6(-%LW.ilf=gZ4tdIdqKpثYa֘dkqBG(4Wd]zLV+Qm=oA_Q̆DY,)\ 65)q۠BÎ YI$NI~wF[})"TGuPZ߫)8d&u L%JznDRVĆX\*=Dd(94*]$ A`#KnGk497hg4"&pz[{Sϸ(hT(Lcld{Rt%84uH7/sO#4+L#unA sifFNwQanzABB)(J+oLq^8YyJ1`C/NJExW!B`aJt~ ΟȟDhOa⹂,ø夭b@D cTdi"5HaR8#e+6b̹ ntjGSG$vG&(6Ȓ!= `.TYǬ\tEe/ ASo ɂn^ ^НNs&lunlVvPt fָ߶n}F"~X?NW-;ݗ^w[2_q& "!+[#Pg!BJS'<*sjAugn~6{?,^h=OV/9ѳ,,Z2Ddwr"h)^MyhMR [zX՟ UzZPZ|֜峃ONh)Ehdo0>JzؑZUC-U2h88FF}tY'`)**Ib3`45GJT֋R/&ZثV%8ލVc5ͺZ?I5eSiAs]$G56:Z[~JRYpwwpס1(֡S1Ccoz~>(}64 Gb9wNVu m ^NՍkNr=$r]Yt%l=&mnN;HἘ@SVֵt+^htkC,ZSDHۗn7t+ uJQG hxt&݊+?86)&5י.G ;eJ.5D#u:ߛ9YUN0ٰf9ܚ=^rk_R;F 9Ev눭?jPtX8fvjAB^v%#FZTwtWȩG= f=|$>uK@%e=^7{'i-(a<}vZ~ y`}oJN OAE"Tc.[(؏U%sBACz:Gbp;EE6N:yMLH&&j)?ڟ c =1bBP>(gS93!q:)D0K512'|[U0":[$6YΊ`r%4+fWʫ@&РEұ1@-,S;DYM @w=հHD$: l 0c 6t8ؘBٕd0fA8i@^!f3Y)J T! UG'T@'L6\ihdAsK^\BX* _=iy:|* {X$ lTj"7 DŽ +ՃLi-Nt՘8gTٴ6p>^-MXrZ P'a"8- V<5ZH>}{Cb)b54}Mܿ )"ЗLxs1b">]f 
딚ܼ2uE(ikXeSpABȥ1ۀ- uhkBwc؍źe75\A{pH &YLPVcyy2?3‘cu*ȳ޾=U%.n)wA'Nr]ӓ`4GLV2JShSlyYɘJ/$i qG W7f-S|bf5"Eɲ[(X| HRiP:NyL+!IjM>>}JlC $ك{M8{_F \ABVFbuxJz&ؔQZ=5R ƨ 1}%VW1d=q #݇_[m$?iTI_|p-is'FGp FFi %0mѥ@[@\ԤFGC%ML;~]]JL$9ſgM:0<9{_kMF$NCl~bpk6l>ަHU]kmEKѳ_ )ItNr C&jlő$m% %Oj԰-ku_$ך`k_n"rԹbǻċM^ U7nJɛnaY9k?v xkZƻ_aJZƿϧ7"J|_݌WCZƿc,q&r#{.=BV\].}5U8"1ݗTv|h/mH*'_w+3lωʐ,eT9YFPn4˞v?VېCl!\gdy+/[ Hw?< uV4?[1OVWI6{?IA/st!$EGl 'ipr`K89%τ$?2k-ܿ@&ӱd`/|_Xj{ n<|_r66$r֔>53aV3NRKJ i)|D\n}6z!6No>,?m@N~u,i'I[ krJ[-4-=PGt秝jޝhN`QZHI'.6 cywnr|F'2ڢ2a0,FAE0#紗EG.X=bړ3 Sw1qt^'+J (\R ӏgz QaHeH۶XB8t]IK%OA_lXbeʻ-{jBjO="zW̻q;т{5pӄgpK7Q{7c\c;[MzٱflȰ{2n7|x( Tl|wݠїKߒ0@~ӣrNf̠Q!*hu5rU52pD FSY= SCXb@h@ʪ~w&)OSW%q18:ƴ'$GH`"ݨƇxZ=v֢wqrJ+{ЧK5R\PaZ+!NIBK JS0Q(J)n4z5OX>βØ{brqtC&$ YS3A=Bg9k1z[%Vr/Twnrϕ|t3}0"4Iuk $UcFoUlwޝj$-(52p?OAkPn|CtŊgMkp{utNv0ܲ7l;d :mx__#= wlKj Zjɞ=9KڊG8k.)NULDCwsuJ"ߝFp=F88w7v8eLa\&2w9g7^$в`? 1q(}SYo*^$e^m- @=m)*dR xX;o6 n=q4%ZF Gw/!/HőxNbڱYh$v(fc|499-#&ͦI{n]*ZN@ bLkfG۠TʞƎyrIk ? T XO N-sGn>5r!V }-KOFX0yD4h1LՄGeA%V=ۂ<*ia!lA6ǸsCKj)Hr;^Ɓ`y$I.0W?|{1hz/;g?Ehӯ^DDHQͳ__ p<.p W=f2~8NަߍFH:ڀH!{xZ# u6dɟyҚX$8ː{/ L${cFB? }.]_oQu/`V..>\ t>%Ϗ\#WUޙM}@]j&ES㏅+Gٻh`ї_-QHes$ApN2RjEN)@SH˥S9U |9Q|7uf?ć"QsN3$`l0~yLfG󉐔ɍ|({!YY$N8E}kIgWGT {rbu2qX"s 1? \x93x36`-Hz|oRI0! T;"._U>zUdwi! 6`T,G8^n^@֞jr^\(2{KvFZȢ)Dm -0ޞy,XKF> S._vN-+ӥS6O|/OˤmD0ٻ綍|)Ƚ/RL%L KCCQ4IIipSz@"Tb~e%-4ZdEt3ǮMv9.,ЛʫLjſƮ-bptVтF3 :G#h@Xd2 ,~nV;2LqowתgoN,eMR>n2qWF moK6=[I#8! 
i:}TaH8,&<^ޤNhFxF%V;̃i/nӌ%{ϛoc 4w[Uhn6Gw7PFLmh ] VBpĠT+1vw7Ш a]9VMIGdLOߕrti57jRT]rC޿mj7\+Xru*e 3ㆷ?MUղ^D!cս1fTS._m߹;VxPq@p 5'BD_n3Lg:X䩬"0 > jCA}T[Ad&UQ)iU`Q gX+VJiJI dkJ #w^$M,8Ao܂|'(I3.x)}hGFR>{N۵@ DL'տZ̫?޾nW]+C:O#&㢢zX&* =>JovCOt4,3pY%ă =jB'\vK]tA5pÒ'̵쀔'Y^TBp+̪(GVWYPeRj 'Dvu,D$ dI!ܬLD8-ȨVh)ֺYJ DnZ @h)5هFg5;)+P..U5&XnTuCH"{ `;W K' ̱t0q̪`;ٖ/K^/qD SR+HDe[I1ΫꙓMO뗿RkR'{5%]DG~U G/3z`߭ ֞t}A _3_ĝXp{̪L^.3 ɅNIH:hPXՕm29[G2(Cw\hq1K6Ionh!8% ƮZ1_5/T2lj/uU `KFJQB(2šD i #؍ݗ!5 lP@0=sKDC1+2״巟+ @S(0ˉ&T ̐I1D$0,)1E1_6/_Fg^xD\y› vvn3H\Ah4䅚.ӉSq ~ 3?;ǫѲ8[1!y"Fk9"8='.C}GEB7դQӾkp7<aU38Hw[6.&xP l0j?Y};nA|Te[!ƫ 9.F0w%V:q̰Q㣱 (LQ= 1Ynʓ:;O]P1v`1c ݹRDb9 L ,JP&iqK9<l,B3EqM]vV9xc Bͩ!xW$3GR@)_~3f{{dvC7F18cf}y׎qv' =ڬ؎A7KEC5ɐTnDǮLV,&N3`@Hexhհ l49sAsrKajTMZHeBita~޳-XOS8FsoUF ;aa(O@!H\}",sN Ƣy&mF j> q$TQ7vKkN9rïG֏!#&r9+fIq?IS%4 P<%f,1޻;0nEIV[.1y+F  .fu>ۗ`׆V vߝOaRT [1G[Ͱl#}x/7'i1sNr P8"d@oB뇇רe<5ܟ5ld.kO 㽽ruq5dk9a`DSl!g.Q()2nvy  ?!Zɮ.4sB N-4h]%]}3h{ W=NtbCҹ9s"1׼T /V/ jtPtSyt L&‰:. I퉆AbS(}%$Ɲf%KE1ȟ-Aoٗ\goA+WQ.)Юي`g5^,P&cm9s4{J&"((<Z?m^Zx,y C)}c׆$z2lOjtيƒ`f7zVJnolZݧG14M0R<mL @B uDB#W9w :o?ߣ!$\\(M0Picmm |_/6!-{$%_I c_ۃADpp㒢 S@s,3/{6&00rWH(HjLHy"-BeXFNJgDf񝮪Qt:RVj|H[1"9xIxz\M<~JuCQߊBv1:Vqae_<sHo,w/rނ<]䨛WaH)`8Ш ?5pZ]2ʈg4WjZ۟?$'H5B BW:#=T{WpM; Ľ+ż"'*ŇGr")$iT*)AҽYÞ8DW53r#[d{97y t /4ڹLOEo6:݁tl:[8̠Q9`zZ5oǞF.p^$! aL)P" ʑ e0x~:b^pn(`ш<>/lzMst+XB̢yT3TKHJE3| )yh6sF-*an $o5B.]kY#:P*/6+[wj0|-֣2I\ؗZa~@oINH$io'Tٜ2)B!X!Q1vwqwDDA*~t'4aiU*+Q0*W%"ggE>y..ihH39D?_%_1hŁ P&]C@I.c9,so^9̡D][SEty *hw\6$?yvt0["Po:^%0 ޺F o暅c x0! ߀hLoS쩆UCd k .@S$*)T>M{EG:YI_ h;$G!7)KoB!o:p]A-w' \͗F=eP?cp+P|ٌ@NWKs)0fkA iΉ2'48/aLZ --Nڎ'w٧C\ B8ՐjUZ6HdDH6B~9Y$yd}`w̿?` JGq PE|&zJ@ $Zjl!ƚUqb! 
-ʧ*v>V(G;)orήw&Q)x"yj8Ǥ7EgeU0ы+2* +@?#+?^n=R!ApaI@E*|(߯zHJLj3>,.$6k~U]]U]y:"4k|ex/WV bTX̷ ze?} DpWչ7:n/݃1F|cԈ-LӪ/ч3hZ/Qj, S(x1CR5ށb>8V,k,s{+ZU%Qx!֍[&ҊPvKd~s aJfpq_̱9X/7-v.aҌ*r%Ppsy+O*nQdHZf':R|FQ\ w|77/Zco~{C(bȉQiMM1*A_=ly,>y28%@R\ᔧF3fV*ěI0Cy炥2HLck7ֳwi i @U0YLeгiL\ЩD(´\&2쑁_%AAa(P =j«V#PA4JB*"Vj's gVJ)0 +`_gCo#|5:5s-8QK0e1!0|&pVb8ކStK_(Y*|ee}=DisIvZ` k'qp/WsCցڢ>x~k@x4&N2$MVԓH˅FT Rb)` 3yĴ4.0=ʨ7< F<3J|Jxʩ4x}AAn5*)&R90"Y?jB\ ,7T辦X8ekLvط9 ew?'?K޾Y~ dx=G?2w0II?܏'~7 §8u30?wF φC FR'ݹ;F68h\s?}qCaOukV0-NW$5J?T,|=RTm RjHuYMjDՐ W{6׵vL_"2{}Dm-aek!&X'aQ`4( \إx"hf5KM)ig߈畁l^_Zғ/xm9=@D^qx֖L͓`?q_ZhatRTûecOg'19ىeKR,7N-y.iȣ|L~2}v ;s_32be.5=L܎'a~ K&[y[.zҠJy)\HIê26\AGooF;`)*?C9. KFExUp"2:OӘ-t_uO~2Q7Ҷ9Zk,dڰpm@̓07<2Gb:9-B-??8Os|x!Ky$B9;ŴQ?gq^x# e.Gp䲉*1WRۅC׆uU QcmjF:]m,"i|ӆx'PPnn:eX`W!YRQS4T"j/瀏%:I8m51:0$J Fx{ gd=RM%f2FOgn5,9%6ӬmVsAo϶ JPJ1%@ZjVy=TsJ V2=~ݟLUR8OᧃvL9uб='q\ICW!~7N0l;d-x#O <~<ޯ~mrnåыF/hoFưiK[?b lhҁ;gV/;_2}uW݋p]{szoob$(ss䋁ןSe3RVb%_TYca'[1rĚq&?RfŖxM@IvNզ]>*<4`$IxKq+[ Ǽ9E@L;3^`y DI`3yg,aX*X!l N5JPGY]R:b<(IEp8 zl8Cѓ~fzt'0T}WtJ4u I.o->%ڢ9w٫RTiB 9{ -a-`FKUKQ08LMqj ęWc%k⌴:5ELB^1J]E-jJnҥ+fؘ2tH?ԕư&V "wo,[Cu_IH"Dth=`S #8nyB^{Kۭ޽NH6T$=&E뫚X]s<ꒂ9%e+Y,7|d I}JSvgi*JaIbh;$Ž#Oijwa?nZKl<1]zPd)l7:A\2]K5o5䱸%ZD *ѵh-;")Uċv^v|?񤾓ر d.֤.`Sɸ^= G^3.'o*$?NKu[qcӧg? 
\ykE7DR2mI^=EHҋcKFqXG W7:נ%GѰ%lho ѴcygҜ'`!}z$M=¨5'JHV(j˛0MSXœ ߷_'&~X%O$<;j,wt<;H"~| cnN/iJ^joxfHfS.-LMG1o:u] odz 5o>{77%!abF%R7snrYL*W{h TD)c I5T0^"p܍Sk`AW2CW٫WFv0xsqvF?.eWIQvՇ~JzɍӨoX(n}Hi+zGR2͉j|Wsߣ,rd9%|fj^% mF^BSͧTy򊵴uk``ZdT$Oqi+9!c(dj0.rqC(FbTjNR[4 |d0&6Ж4&(B}UZ}^be-L(G S; ̄^޶ځAF,dtOhvm#z906{=y5iHõ]h^N3;1bsqq,|xlRCSɏc[o3~v86V֥(}I)}ob+ŀ`,-e^߻ݞ]&@`6 ȃOv(K%Ij~b"Ni6` Reoo|> (,|4S >feG 0>-ث<&10K&O8ɍ WV?7Iʰ:sq"8&oTxÜò9LSڎϦ1x1r9O4Lj;§Y]\Aφ&L!.BT!DmYeI7`Dbcn[WSޯ9aM1fݽ]ˉ}:-'+JrD?_ͦ-wc;05qOew&spM]"69L0iklw?f5=Zt!0E 9滜kB&BkZifSgZy6M6clHN\bf]"fS "BiP^QZɡ3hDsD8c ϦQʄYZjȖ*ʈ@/X{IF̆s2t.wÁkHv{p;f{Z`2j1x^Zr8=& -N/֜~}ZK,5,1wbkrWg&8KQR+UkZvVV:n5)8o$/dMhę?|G?@FYN$4̓%⠎6+?hgnwV\8^n\vGwA/GFRû6aF_vBh;AKpMLR*`%PGTLQ OuChG`(FXcvVw޴ $[Au\8@ uþgwRYY؅ ﰉ;"G!6<͇@ wJ۴-L)rl#lb!:9.σ8LU >57YGH -{(c^aش8K!B sY IyTdOP0KWn,'cpPD3xB t$fFO-+Vܫűɪr_S7r3fMo)ۯE, $'%zRr'b9;H=mqA&nw|LIM}ߓƘۘeb@ ^L ٽ4Uudr?L tDSG#w:.O(%V_Z gLEKɛ4}†($6 Y G"޷H:d , N?I['SH^I!*2W C˕,$ҕՍ]DDޥL[*4F&b(`5!+[Rr$PX\ fuPe9j;*nzꌏ:0ٷj&[!6 4㕣bpB!C!Nh@AQizJ0.n_ ߒ%4T sw8좴<œM{w4xc)J07/8@T wCm'EJ¦m=uPZVmY?A~`4*a^*sYqvr=KɦM.^-9(`!pďPxҠ,g8]5=OC\ e6Tج^PXGVfff o60 cajbonΎIzGR+FkoT|:9xfY:sgt b6͛ʈ<[ Rd ulb' ɦU߆Nzx'-S%)aS+Neni7 9VO/?^{ÆE]Oz&,@@70מuwyЀ&9ѨE @FT\9@â:Iʤm_q\ 9z~9DRx 1#nq=!B a,]"z4$7&Izlhgdì~sҐ KTґf0ZuJ2m4V]Ml2(1}:!C-%<ބB,-0w4J?K*&k̑l<ɈEGH,r"$IZaXhg b-eU Ɯh"z y/|sCiGYXM 0YNO/ 8?pWNI|t<y J p[Z h`#_pc*Y˲:GVdح&PqB ls֐={D-0fgL:i᱐a*@ŸYTA%OZΥ`Vy FM}:Mʾ"W,,lx&d6683M0 wm _e @S$EE>$$mU_ '4\$\-0 |.^- .Iu-htF^aBiIbe<fI?pA%C%V|_ +:FV9f$4> D1ؔKp&˵ٮ& d2Zamj@d04]qm}O 1,dC+o߳p4%޶$[akdS$l,0`5$x n LQ>5e?3 \pd P0 p sc  +Z -ZNc DY%U5 z[jx+Y/yMO8A-K@:/m!80,K[&EI?,UqanNaZ$ tˠN'^)5a57f}Y€3b%yeNz`w?%Ry>af{u'!QR$!pZn/oos}9Nq# \ki6@!pqa&}UQ:Cڑ>xٛFa[u—5l5}1,hc]ǻZN ,vMLhz h=8[$Ah7(1a@)3u Un=c- f \ʂqfk(0nHMۄuQc4]I~8ĿUӂ 7:ua>j=Xaح-ejm,4^/ƺPHyf)o6NKCj 2A(@ "eGVҝlL*`1IT/?y$2OoK$${`"r\ZV1Q 0" JD83W)-: UrB؃6vHX}iXk>E``ԦkV 6:XM SRh99O \̋A_tiۡA0XꮨCb9+=%'^%[ @y! 
/̬[-&h0Qi!z!xD; 龜Hb6]6*AV>0ɷ }~iAIAV~R,ɥ s^+B)D*jn| gdC@@fC}S %՗nu%˃ ``|KZnQ GQ?a]qV ׋yAﮑ4U7~ rGIǸ+ұdTa?sg '^zP|n_DgrՓ4ƣ @%>:RnL5?ej%5fxu:buNcim橴%5[R]l+|iwQoiFQl~wm oDƫW+ʨ.W?*|e_`Lnݫ2ǵ&lq)Suw;b?!,Td%"w^;vw[,SRn}Bߛ7Y${4y9dQ( x!r,Lڰ̛NzK-Qk`uJQD'<7{2wU`y{ΣMn>_C1!Z3uZ9rc/I"O5@Q eOn0 P IxvF?.{IQ$(9L.>{Wp6|uՏKI>,Bu̸waٻ6$p(/bg#v.86cҊT'~35CjH 5W=US2Bt>TЃyv럟s}'&Nǩpׯwm6]>%h]nM{9A߮ uAA7PTeNz2H TUPVY2;>-Y*tC%3&:n7]}_NE:]N/@$8+!s$Gg#8R>}0J?;'t\_F ٝD`[óH.Bt:0) ˈ)X%L %wRB v 8G\ 5ÕiYy4Լ$VWt jG\*vOo|OR&O>YMRrKeW'-rn8O bxV9e!rf@U&w6$IQĉU'!y5#Q9h8]Ƙn{\ȐLSʒN{&Rux{Z-9HV#',z'JdΑ۵GlyLQ'z.6F/ϫfEHr.mV: 6WU˪eUŲe-Yylͼ^髋 k6q-e-=bWln,bfӫQI RnI]jR:o@'%jْh YDմp~ 6iiτҧrF emǸy!}O I}+n< K")xt淑{(!M[~v- !q4Q?N<hnOԃl ;}|"UnQ:S0d"zLbe"ȄSwc-痞z0u6s #i% %63ʺ#؝9l=|A/wiht0f7U}VF*D|YIQ`6>1 ~|~U}>YRʈ-RK VQ;ɧ;ْvޛ|ޞ6AWZ3[RK֩=ȴؙ~;-%m~o$*%Nghe }Y%auv,:-UpEg'~Yv}}FJahh&+`4s]Ud<DP+=yN:J ASH$:-t0b譱oΎ^rHݸZ+ *˽֐*ۅ}3 *U?I:Ɯ,LhIWӽ)^Lm}+Mu[مs$E)9Ca#&2i*9le6kԑHɨ32!cYnclUIT֠f] p`.ޙ,V.h2A*Đ,D{/_0ŀyo: :~X^hYaa({ٿkvawa^ٯ)Dͽ5MQXK\^٘Sx٪H*DdE#vI 6r$_ob:f=X0o;cY%+J~A 9:)= -˻ :rIr0Q%޺-W[x.&f4L'h0M1B<݀Ő%NLN.C/Q`4ixz!-bH*R WYEŒvd䂡 倠(.JboA%/u %f*sJdȡE<&#cf9'%88rn>nٟFIKj/K ӳ)atb͊W_\XB%fdW'hz|{^?Ώ(8A5}T~$o”T /ߏJwQo7n<vٔS_Ft~!pF摘 K^t\J30b| % roAOY.𠤥Lk49~KFG扼|bZYe(%beqY@WB:b>"FEEGq9AM BDA )Tzj7]'r }/{Xi*/a)Q0I,A'l=+-b$ eG6Vm[4֢\At[JQtˍ4KEw3R ɴ 8jJ[}5m=v|pqm\7R:>Z6ǽFם[NOGqBlU^YnDAN`C1s9HBCd"* T T"JʛL3="%Ϯ%1f_V[Faޫ(,FSA饎:,&m+cxMz4knˠ;obZ( ݁aNO(CR̖[J{u{.j {d[~?jRXR}NhŒe6LqJ3ckZF" v6gu*ToXMsbXCsNk.fJ좚0eQƢzYFV"\r'i\FF9|P~ŕX[qظRYAq9jԛ1J!Pޢ:͑3Y)SZ04L\8ZKiu)تY/SyTY\e!X ٽwo1 lg `6- +#L 9,HKy!5 r"qV(N 5]zI!I,;*&Fj.Cgzo͕ )*eB%re0BE,e6S5WB*W^8z@ >Gbee UV|FO02rAw0U׾L ;vU&PAJ}#-Mǁ3`D&%V(좙SC;iМY-=̴ޥ-ч`M2A Y>}vAJmՐ9Qj}*Q JPoaW UR;!e o.ue#uT3pJp)3B,,#lLL&ދg>[;pf[d}tsw6+l3&$ rMFXGDY[VhrDJ2=ijEGu,rE0Jhy/FZrtE,X xJSo;kD=? 
z}-(ۊi xBJTB-Jl]%5eV jP,nǑL6eƚXCF[`9uume%+cy]CˍMzrDju:yiͥ^e;O˼N;F9>Lۇv jh︣UC&JԘMRn6sg;zK<㪍)U&T?_̿wnɔ'.^ޓ57 K]J _OJ _ꌔTX*+`易i7p1ΎZ 4K&%5/+\Pvh#G,k(IAl/G;ޜR mڒ V޻x ok-?Ngښ6_ae6;TL\qlh@beOn &RXAR6Ҏ]eI~",˝,-Ӆ.[ xr ɬ ,MqX˹YBrXƢ+\UHM7#/Lɱb5fgzN_=mrAnFNFcV;49^g$0>5?ix@mVHjk{- eK_%^"8:ШL3z<|'=l7OFe͛&9-EV:h=t׍_OnN_};_]Mcll!V P<*+u8D >Ϳ^I)[7X'vJvk]ZgX[լiBm)f_l@h2F-S֌64|Cݦ8^=qgJ[X.o(vzGw7pK<׹, #^ykAIw5rû`]F!nك^Jdص9 ۷ր29Crl FҺi^Y͛h;nqW;?zOw; ۢhpF:a1 ` LPFp({!ڞt~?269ӭ Ns1ڷ# n7#TYll q Ms Jwk O28Ó|̈́3 w25iM;dN&R?~>}VOhTfT*4 H&L3d&SWcB1gq ZA:lxQ[ף@[h<_|;q~"y#y'jFدGT杨Cܲ979A7]FI W!qiWU7s~LE.K{?/88_0 I%ci~J"R yʱ e1HJFTt"5`'tr <1(,/ %xN7R7]zHM%Sp%$Tn!.>=zM021*D֚Ҁ!44IE,L`mjV&lVB8L6RC5c 6NDPE_^+ 㽚=oyT$lP/3W <ce m4w>cqca7}H4o6iroNM_3-aމ䱧Q~/J)B,^0hJ|@#s LI [2 wB`pA~8m.77 T61onHd.:?KQ6{=Va^َiLhqNz!0I:[lA䎱7^RQ$&"zd[l3|54B.|z) {o_чFխCV7nM )!8e!W(I]ɖE\xZ{:g턝=LW[w!W'd#ktF"*"H5Pבx!<Хex` =VNrP3d3()Mv\Pbc zEfEL;ѱ 9T춁T%=GNd֮~yGɉ6Η"lSsd=7.8Uc7_|?I5<=fw˟mWZܽzfo&ͪ7XfնG1.ȗD(J⿚MKSv2F“8ԄCsA?X)͝Silk(!͝.DPlyUC< M/V5պAϙ C?i5~ B ?ΰQU|0e98QIjSD:ҷ_ v'c $!zf?@iblw,o~:ˆyr9?!+o@?ܬ?~>_{3'Sy _2#~6_VOxz| oߙLq _SL6RbͷGeטQEF"kKɍ )F7brooN4I#fiZxIN GJ(9U$O_p6!@%79@fBOCH(&#@IFAm.c1<JRn:i3M8é6+!%RJtFlzfQ4Z0РYTPN-0(:]ks {Eÿ}HHLTkJQbMRօvҞv.bo]0 Z:ATYlT H`X { a5ؾcvx2Cbr<)C}N΃5Qk $xpoz>{D)l&@`"3+Bh,]7:<4dL&kIr'YF:M`BQ$%i,<CA UD' f$D&q$R2iebu(ibz H0ļ&"FYaWY- c Tbx12o|F`Fpe 9c0T415A Jg1 )&8U43. q$@J̶h<_|;;ɻw^Gz5D'󌃲q{f`^KAP"d(Lf1+`P-F('&~,c-%d4Zº^ D^vw [>Qkڛ^b帓XA'iR֝YݝHP,§ (Xȇ.tcn%(e]GH@`!T5#(e06oT@ 5f+\V0Q(+MlGH(>jrCi6Kme%]DTܫ7*@LF *[[H()"ulVN(M?ݗC-ߥ-?̭R`n@- з sR佈p+> 0XkZ4؝ri-"LݦhX(m dfpPJTvEwR J5=@K۱Gw;,34Q*8h:ܸY͆Wp(f J"[놕K~m.ψI}ٓ!0-R6^z]#!GL@L &&1Ib>\(,yH3T\) CpDPї9~LI1#od (26\y) }!='\&a hML&Am/\ʷվg:p"6޶8'Dj`Ɋ=+'SG|00=}e2)KqfBp"TL)*oelp1\tA^о&p&oO UA{VƩVG0F(:0fm-vq0k3E~dkrR,gz.5_'-G=+M[ R!M`9ߺu</Db.xS/64(HDσqߎa֗M<-:-ak1Dk^#[LŪQK&/"n1TNÉ. 
mhpǮ@ZH¾\YJ[^闫T<iF ]wڜ bٛ|GJnh_K"q4K^"hըOfl;A4C\N#G̏>y~Y<,~A?+>k(ezHЁ,H; LPx/QI0ހ `-ƒ@3 ]|}m'!HS]~3wc3h K|dN?uǝ*wK-d,۲Q uuF]qIs$|+Vڂxs[zimL8s&[v4Gh%V|'XQ>">#'᠟ +2"S 1@ NҞCAbJ16eҢT &[G$tZ޺R?K{-5ea=9Չu;3*`z7?Z%}4J6UTʀ<"Mni?酱#,&5N96 \I|b% $si;*-BUZC(Qoa IN/zxp}2%rKE%*wʓ[=M<ŸF~z{(椼Io|܊M.ɿZQ\6{ZѿZĿ~+ܞE=IY%%t N A׿^GG#xʭm6\ybzoY+ ʭmP mnЎ_I ֋7ޛѨ{ޏμEM@`ܰ$ylPlN5ԨF] T<(\ůقƲoi"5! >v\z} e]ߴ~PlTvJrbyqElܽHWNL+0#7J^/Ō{M)?VR#?1mzͰdQ{MϞ(ś%}+{$um3C~)xqIHz"[&Upǥ0L  |:M8/IeW\0\$,Q^*r,pc1e 靇m8OըW`BiZ`Ո)\6u_=L]fs" hS0uL5vFafjl aR BJ]}9&5nr5WXc)t)B"u09PCܫpsLԮRb-ab5R WNm̐(p1\*r\8AcSѾ9kYKXQYFRxuH|i)"1WzL 77m1opx^ض&78̦X~^G~t\Ȓ>)dG3o-M;V¨5zfCE݅*P!i<)H4%F }^|1IIsDx7C5%=1ӍDyJD>(R\>E)RQko>cwOAjյE9ޙX jm&>ua. {+LmlW Afm|YIS1O_sv-oo'^`8zhԤ} zHrSy ( O0NGЄ93H {q j>@x"3Kv"$6R߄מP=[6Vi!0e=܃\RؗvՋurJ>Ni]w3LJ2=+^sXڐ#L:\/$.5 ;(>bzzX>2.>v 7Qˑ[\ Wjq9OuG)Dt(W7!'$N!-A+ƒKkB8,j冢G/v_`BxjCXئop$ai.3[;o;gwh'v|K+n,q0iD QԽ=bI1|GZ|^0%H]^zڧ^/^Y ;`(veɷ<0j_I9x\ ʓʙ*wŒoyO?90 tq UهNk/W'v+h/4A)ASKaqz󥇧cQ7|7Ivnq&sx&dY|#G% >XZa` |0yD[uA+eZrH0Hi9 ,O\1CMD D@Z 48`B PV)q.Cx%#NȀf)%"`f .U[Քxͭ98P-v ШsnRA| 1i6>FDZf 7Tae( $8kop\(Ls6W< cV -ObВdT"xT~CּpBO^no:/jM%sVQQϡN#G y>vtY;+NZ %f-Xic 8JP[8e2 A`kC{|w9mb?/Ъw0ĒO}A@83w|Oq'ʝo,kˎ.|` M7]2)ev8(H!@)*qx*Т8 &WziQzҝ(DwJhIPh mTQ*ftaaeILd(J!0JP>`vz3pa޴78#a:o92I܍3OBnZQiwyvoch|r|'Ώ,|ѻ6la;LGWfxfG3EvT&08.DI$FG=Emsv?׏N p+'|^)1#$9VSJƫ,Rb(ђ[ǕUd\;g-gEW~L7 /RƳW}|Η KM{0S t0Gף9۶鞞 1ًk(23|gjYIvaP)&|[rwrɸg*1pX ǟ\:Mw+OsCp/o^zܺ,Y#n;|… (`淠A*v88ajljvntgo'5o/_*~ȹ||gķvxky;A\w39IO9/Db6֖@ںgooM",d4>&)ioc{o{qvd?÷@} (_ \zsbN/ ^At6Dx=;n?}Ӳw@} eo't'7aʫ ^R,N   wvӱhf4]:}NJUS US >u9 ,`ٍN4f 琀~IM>n#xiXlz>?܌&O`^%Z??X=>yNypK>[!~^MK'ဟZ1%%9ԘK2eK=mDKx?^t˅K ݳ.uj8#r 瑢P7֝ mօ6x RG,^nu8K&ԒCΒ,ٜgf{xYRhY8 x3pdc#f#}8rPmӜyyD%u-gU XǴz^C)BQv#B"e '20 G@ҀQ0fDZ{aed>f4sI U2M1qi+Q!*)юKKEZK+ FQ0.NXEa!a=G9)5, 8[e`yЫ)Bpia鄣R 8M"`GA(Ic RGrIF.F'↤@[K{6e&6nqKٻk2K!{/d҂bsoT|.cqL;Qة5 }֝ ]'JBzS8{i7Y m&Oi̭!=5mg:o}(tr;)#-%oQrI(\MZf>EDU&JiD?fuESW(+`h2D ^[o8@[X;ߎMµ9%ru~_^jp?L7NëK':g5Ɣ$ ]ӽ0Zk/=x2 \: \a ]9.q6I, {( 9ȦUʼnOTBh (nQ}sWyx +Ms,O ;ggчbSx/^ 
:aPH/]oCDӗP-}?Ҕ1DʒXJ)"+i^ {bOQ!@? cZuf+JGzp@QPD~oF䓱mWbJJJ8Si9G%sTM8sU>ZkfqCx5"Up4oZaHS'0Ea Td7Oڅ,izpph2&Vj*)qkI )NNq1Fl_Xkmֈ8/ ڰVY j$D,II툣U܊ u;1ncd,6Z}ޖZ织mbl@j4U8: *Jc'ѩp7LJO%OdYյ/zMu) (/kѯ@"VLd!8K_XdR)n$EA /WI36ܗt4 U_fd *Iۉۤ4e~'Y՟M/:pV B]UwTuPuYU-Q7E9C02 swrqr:󼑡.uPgA]uvUg١ kKl8pS+ &bPvMƟaf0$'{H2V\)G{viLLk:X_nk+1x% %䱦A{%|!b F0Q$j MlA༭tE0O whyNJ( nFD2dtYXE+ژKD.Te6 f^ƈ9n3#<IK pn5'LxgXXʿjcJb#it(<":z1L Ⱥ:BFa-InѼ)M.1LQc'Xv y*ÛA޳LWǵ$;wG?\;y:3jK[{6,EHQs#6u4^ #\(A`Hy r$24rR6HF5nV%1EQJv2`*Rz|!o#0FstC;B|cU<La gCMV4Y >a1W ʓecNLᬸuVkѯ?rׁ' 3 gº_Ʃ,yԄ>xSv|d0Ztf3 OV gkvbW0 4?^oMC X/޿N4?Cأ ɰ"]V43aMAh@qq晻ilGY'#=p,<4QmO(͟LʺT 8- @">Mm0G埾^Z9OF7`s+r~> %Hpmvj~ZB5iT5_|;1kmNAtdg<7~ҏ"“~|N !v%c>tG]&醂]V1""V2Ie˼!I*{U X^xEJ~RJ֟9 3Z&)m[YCE^11`VT(E J",S G[,- S3Θl %(!Et3G KS vK1t Z S](AI%HWISzD 8qEq1d'%Z?H"VHa /P'Ljg[6u)6E6L&ydC2c@:FU  J{1b"jǜ*e6}\N&D˪u_ty\0:`6NB: _4_!w_6?HyIִI3B0͸ꪉl(g()J6yvXeL2Z] z0~~;]E7"U3B6*ky |kII2I{BxFZo!-EQ[N~ m;͋#gɹr4uϾ?h&Pm g ٓrOBC등MOjgnȤ[,I$>} "#"CN(.2v%c׏lfzeOIʱ-2mqGZKPv6|tmǮM`$aG+Zp켌 ѹrVвRkJGt% 彐L]5Y:bŒ^ 2(cNgӑZ1mГ:V'c3TRWq&B"A QmTA+Fhy_dh֏61RS{4:%$(a^#b|1\858w5Dz+YШgkp>>9_{m5}yN?>);.M*qd(:a Idl3[xkμQ *mf[ŀ$ʮvHKa]%Yk-Zޕ6n$EЗ}}~xff0Ne30fVFL߷:LIQ9AbKM:F*8yl[bR*4A PIL8Q:. 0.CZjr G^A‚ERi20 @bCFciIĵV!FHBRTqGR(D"\i-W?UZp( L,ӞKW)yIFF4^A3CD<:n!XKc )bmƤ10^˹.Z%dIՖSrA&8! 
zLҍc s3;1R+F85_@u^dRm!t̻W?\2=NHU(_ؘ4)lo@>:#%FF4/Zj\1T+TWPAISq3fOpd| S/x  c/\td[xi/._z!.aցMnPtո8o[KrU [I& \Vo0?w1S{oaD#aKogOD'do |>9b󁭨7[WL/gO >s:?{=tqEEJejA ( 7TiT7*ׄ^ڙ'Xƺ’1hGﮌ>,o,x*]5|!j{(ISk/͛7gAimuh#M <ɐK*R ̖MVTh:?ؿQ6ZMIbgCzOސHs *} -N_WKߚA؇YqǷpP~@!P?Щ2XQr+4AEYQHv|(VTsRmjdNY/..z^Ar핓 K E"_C̊H8:YOsrg&$/ۄHu=[d(I9z.%'ANr"qТ`5]*ԧ֥[Qw?O{5 Bs発JOԎDJ]W e^p?k̿_uKa.y32VϭciNb"%R|?&M2A)bV/6@/3 8Nϻ) E#Y{yA\Y";t33B*j(յɵЬWM6i.ؙU&?VU8:=FZ{ 7 :tLu߼]ɥ{uY7%Y';1;dUBwW(Dx%ѩ^=z1V!lwu[NOR~FKqEERz ʗٓ{ mi*Kt̓='[hi:f: <$/)L&!E+/+ݐ^n&bJputs6ǓkM͛ 27(zaxӷ%^į=%'Um0$'!qF& VX "w60m`CO5q;.]䃢ϘQ^<q8%Is"(e}@oFntϺ.-y i2Ą5^g61օ_Dټ;;Ir<r$B2Eod7UWnE1ȣ8)SiUkav+CBq_4O):WyAJI{ s,*_Zπ?{(E' 7q>;Ey]3#P&K$BJ:SBGz)R\1,O9R(0W0dGK,:* T:[,/mҪ%1E˦fu1wf4˥^,< ^pL*>9 2K2at8x8D^[ŊmjgSB҈> t{͖VXWɌf/y[1W惧LHIwl5Do֫ ZNvSB[jOPtIÁ}Qwh{ĝ 0s|vbE۱EAR"RbEG:|ç`F-_.`QXb,2%&FR#ZNzSRfY-ȇFY3|0sUHF"GRI(@fm))bjQ3qL#pɝ ^dsNRqP>B!bj) ҈ 耩`D#eQmა+f †nLbǥ LPJ@XXTEVژqbc!$( 1@xE QEU$t$,  #㐄aF Bcw`D PZQ%H0ÙkfNE&f#nB160o$Q@$Zc^H"'1.!5%3%@P+m 1T-pgd'֥ %aއ`n[tQxCn~qMhY+rv};5`rd~ћmO?lhNܠm?r\W.b]| ĕls>M8fTP,Fۯ : ;׿SBUJupIE5)a|>7mП#'Xr)UīdP]d\u(|碓պy sMѬ %Zk%h݁f-xL/L2VXڱ: NAIG`E0Fʽû""q 6y( C*b7׭sp_(`#+S`4={ ^PTo{ , 0"eCsjPL`LlL!j7'.pq6X;X.@R]R\䖍^0E@0ipz>jbRuVxb'IV"FA! \Xj RXй J(T Y3,8F ͋khQ eY ZJȤ@IFH4RM!zwwK1ӂ+q?V;i$Ub~[W/fݔR7_$TK{S BzhJ*Qi{N{EHCeB?EBJ1w4tw/^"iy;M.PK;Zp4hD;MacmD~) kv}nK^.HUR&+>Hbǧoy$^ #axS81<75 Z Hs^Y ^'|qM '=N3WD|B: idA5H:6#tZ'Բ%>*$2Kr%RM //`jX8ViS^"eJ`kZCdZ)(9e6#IeN"ܱ(BR[*THXeW+3؍fp>r/^lyƱCcXKA0B6O_eRa}0Y]mϟf8y^~؄ :_Ko߮7}#$8K8bq* 4r[8ƔAwRIL&e'& \#5 ]^IaڟL3u柵[əxG}$9-;P]>Do"]G,d10ɣwgvw.J`|>9(iH:C*(&L'Bn]ljh=jތFwxu.X^ӛwWo^}]~[[)ۋ ^)Qsv.`3GՖ? W9橺-fL!l'޺LSV*;=њ'qO7o?`mdt[1kqF0?n~[{1$&챼iNLcZo~2cxFTwȪ"Y`?# bTc4a(NNB1>2)>wEļ<#$ jZwT5l]IpJn`M9+N]Ք291 &.<֙.ϝ¯~D`H:]4 thDc)|U}q-H>@]+565\•da.2Mweק-Ed<|dxx_./W+"o-赦Z&tQhK8X%y D}wٺ̠: ~1en\5"FZaCV٠K$%DSʉO5Lh]L)$seŰ"1fS)7f,,D{S5c+ᙘ7)Qu|QSZq钃9.Xs&珌 ܛL-@Y L8=.CtktOk9ɑww|Q- |wkտs:/OiJDH_Z"%{č! 
ec[f~.g@m ݆l9ih_xtp^Xmy5>Vۦl:AQsFשRE.-0(-cektb]~ >״z@(u(0eֹ<,F(FB~MQmNfjPHـuecatL_6VZi Mj-@J:uG<ɖEhsz7J9fQ\))2\2M$ʹح%@&ࣰX3,0Y^hf) Sϋvӟn}<$;CHrU}:rR-.OvAon ]ʼn",JAgrѭVboS6>O8mg-JEƩL0.Y!- c'bO-c|\Ƿ+.a@h]$}vg)(+v,0hϋtZX*3h+pY wώU{wxL|>ܗa ]ʭ>Q4eׄAW"A[^i){ݗ+J(3cpTI b[E96vHB& XddBD&DB[݀\TF]fQ}x#)uh&q1Ndxat 1ˡsiEG#2rEb$p>፮\P:Xk4oYe&tp7u9*Fyg`5INm8AIjs (SǖXm-lL_zAA"ZV[Plh.(&"cVH҂"GZ jnXv E!(+Aew1!?3AFHiRS$t *ɥ%+ò(.(GRÙ kƪ-~HKX1O,9/+6TrSiq5.0?bZp{\*+/f*Ͻ~BTh+uFR٦2{^+#Ώi0˅%^=!BXof`һ:(^$3br81#V Șέ(Ld ^@&ٙ(z0s~50/'h3:sϒj6epɌbf]9j18&_߁rD\\XZe%U*%e WЂ:a͐AШ uja2])]A(P`:Ürfe  -J5.hVHp D{boH(cMW'QDZ uH]!7F|V0[D SAPqr=PO0.T'VO*y)y.̗w fɯ_Fc˺MoghͫbͦehGn~L2;_]$XCa9bzXi:o˼E7 7=|=,`J'0a E_&SL p dYy[ȐREZnCyr7O08 u#ZTPlhU<~m5ai.L,ԋ$- -[oui2ZI)vt]lrSllX8.F Ur Vk ɤ!*)e 3d3K5))\c 2kL^0A hFo -9 ۮVI-B/Ĥh> ](_NSlj^ FcV_}vb, lak׃|Ӻ]zPhidݶK=z(M|)@@~S~O(bN(bX=;0:":4E^lMCiR6 )͏y諟k1tcُa~T;j*Хux :/4ߝ1}x:,&E_~^2ńd/V-4`Bێ-PV2Zn@PxQb>6/Xrb.bl <+w Ijr 0gT6oՂңH'>߬`<[[Owk?>>CNι8sGA)mXqEw>#PsLuFWiO{w\3Q5 ǽ0>pЂOӧIh%gE!n2n{JA/= Bu/4Jܚy֮ɷ~gޖ4 }=py`b(óDPڋѩg+OZ_(_Y:VN $BwuֲEC+a-ňuUQNEfuϿWw1j^[^FGS: qFx{]zo l:@C*p[1wTըu "}%ipc6YejgM#yi$If8("JeMqۑuw^뽤D^ۻ;v{" J0a}/Jf{ vI8RBa{x( <!45fWKlJ;6kź$ Φ9LKtjaDNQv۵Nخ-3GJ1G`g>$Zۍ^k(PIAR\rQzLyc}gڶR fYZ{=P0@͕'ǗMKww{3/NX,^ PW<#,(_50$z .z$Ȭn T GT:;8Q2K (HpJӺD:MӜr;`[at>|a5PLPoF`pLݻf=z&<<Izt` aU ) ?s]Ӫ3®r,' 8!`NώoΡۏ?~Gwgi_IRm$,z ^*/EPteF]L@И%۝ x< cb.c D>ِt1*F\O0$ӥQt-m7,CѸr.cصa@c]tT,a\HѭU ^ʅe*\z)1=({u-a6MXm/5- NJBY}"ޮ{[knj?`ISX45t9Ku4~ +ѺI3.P鑊H`JZ`t#˩"-jA e,VcӟwKl(q7lW\$DtQf({1Y[>GV}Gva[fl8Zs/}>Vě*\E[= ; Ȝ"1ޣz@YԴoJD1.%e{̐ءMO mzF8jxr ^ Nl-H]q){ [=cy4Ȟ>A@6w|e1}  2%| E/\~wN~!TAߧMh2tF{ah~ DW݁tl)8Pnª wBS6]=XHx5:c7vdmpMWC2C$I{K&׏?5j6z475J4 ̓A*pWpFt"&)\`,Ԃ๋.7kUZLK3|\=̊LoE:y}ӃۖO>+w4Y^iQ񳧱O:,*o[|XV8kx=◧FX<5z7H{b>ŋry@Wr_+N~~D8aPz_d?P(*oYe@DWr?{^%}Bn?s ۃx"[T]Dc@!R2FMcP$/8~MYOisNpX"F"!$Eev1{1̆)Kj{ۑfHSʍR/fgg{yBbD K-~iǗ]yÙunn?Lz=y^]?qz~C;qLyJ?~^O?Nz2w'>D?} gyǹYݦ݆wrb /?͖וǜ}3u6|h ΂&4(G0sL9Tt׾iO(sL ķ[ev&0|.=|t6Vbhy(gE 9e ]I}P:F\FKDl>8qC-M lI 
BQRnzaM̬l~:n!`\SvY_5xb:U(r["p\p]`rBq`J:ٌ'$u [ |:֭ES'-Ӻ5o!NIs8NaBewԱn\EXޚu nM hvjy%;ִhOt 1Gdџ% ڐWO'ǻ,4d@s3-ޤR7\7~v_=ތ='s02@[YGV߫ݣ[n2Z*jT`7<_ceu\ZD DE◞)3UW6ApLSxѵ71{Ä Z*.rҀh!B"OT\ n׿\[I%fE-,^LldzGk#k(k&Ee-eyK'\-]L'+'JYYG ".xt>$A{ֆO}Q)NI8 W{8 A:|&?{7Qe 4N3 =!)4c"ILjmޒ"ͥaLL Dlkr|8vh}}F6ZrAH`xJ Fn@_7 ʞo-|@|1-?E6f ~>^u}FĂv-mWx<ӏ O;4[|Z/w"u> G$蚳~%k@&_F$ʃ w8!%< j8יXrMQR'@Bf01M9$4OrS4Fu6#Vh ly]X(tCu׽Fܤ_ӑ/ԖL}lۉYhNF>=9rk|#4-C%NSN&nҧVˮRu  ͟tIL1?rq'D9G(}OyJ=zeLf@S)N ʊT?e:$!}/vzW-H!A!5P#\dE49 > -+ENSa6hhFA^LXu>'F~eq!t;VGOM:x20IAQmuCS,~L`&8kIs.b X?2X}3[Abx',L O@bƜ.87U9¼w7C)VwiQ}SZH:^4JI!S30PQ2}/ 4z% K)VQ[ۘ*87Mb'Z(ӂ*&2 Kiìz"`ټgr?BȧY%S1}, dL"^XwC3YlL>a#o\ʢgh%* 2rJ2bJ3!1[(684Q&Ɖ"Q0ijꙻAQLmvDŽ=Z*xe|LaԴ]Q@=U al z)l7 $4@j!R<ȓTjLWyh=A3d{7W#񨰀 s0ui[0qqԎ:)dG;u)-e VG>pro섻|ߕwa e{ǎ ر>҂jNhF) |'gZaխ.w"t(Z..d*Ϻ,jYEs؉RuYTsF_]B(= eYR-R-LRJ T ̻lF)N Syzj;(\yxǨ٢ZvkFqDϡ_Q-)Pzi($gvar/home/core/zuul-output/logs/kubelet.log0000644000000000000000004643607615145556413017726 0ustar rootrootFeb 19 08:44:56 crc systemd[1]: Starting Kubernetes Kubelet... 
Feb 19 08:44:56 crc restorecon[4683]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c263,c871 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 08:44:56 crc restorecon[4683]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 19 08:44:56 crc restorecon[4683]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc 
restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc 
restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 
08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 08:44:56 crc restorecon[4683]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc 
restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 19 08:44:56 crc restorecon[4683]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 08:44:56 crc restorecon[4683]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 08:44:56 crc restorecon[4683]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 19 08:44:56 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 08:44:57 crc 
restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 19 08:44:57 crc restorecon[4683]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 
crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 
08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 08:44:57 crc 
restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc 
restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 19 08:44:57 crc restorecon[4683]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 
crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc 
restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc 
restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc 
restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 08:44:57 crc restorecon[4683]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 19 08:44:57 crc restorecon[4683]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 19 08:44:58 crc kubenswrapper[4788]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 08:44:58 crc kubenswrapper[4788]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 19 08:44:58 crc kubenswrapper[4788]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 08:44:58 crc kubenswrapper[4788]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 19 08:44:58 crc kubenswrapper[4788]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 19 08:44:58 crc kubenswrapper[4788]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.462418 4788 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470522 4788 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470552 4788 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470561 4788 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470572 4788 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470583 4788 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470592 4788 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470601 4788 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470609 4788 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470619 4788 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470628 4788 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470636 4788 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470645 4788 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470652 4788 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470660 4788 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470667 4788 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470689 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470697 4788 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470705 4788 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470712 4788 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470720 4788 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470727 4788 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470735 4788 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470743 4788 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470751 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470758 4788 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470766 4788 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470777 4788 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470786 4788 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470795 4788 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470803 4788 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470811 4788 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470818 4788 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470826 4788 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470834 4788 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470842 4788 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470849 4788 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470857 4788 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470867 4788 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470874 4788 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470882 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470890 4788 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470897 4788 feature_gate.go:330] unrecognized feature gate: Example
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470906 4788 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470914 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470922 4788 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470929 4788 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470937 4788 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470944 4788 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470952 4788 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470960 4788 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470967 4788 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470974 4788 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470982 4788 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470989 4788 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.470997 4788 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471005 4788 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471013 4788 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471021 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471029 4788 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471036 4788 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471043 4788 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471051 4788 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471058 4788 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471071 4788 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471080 4788 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471088 4788 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471097 4788 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471106 4788 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471114 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471123 4788 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.471131 4788 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472135 4788 flags.go:64] FLAG: --address="0.0.0.0"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472157 4788 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472173 4788 flags.go:64] FLAG: --anonymous-auth="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472185 4788 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472196 4788 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472206 4788 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472219 4788 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472233 4788 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472270 4788 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472279 4788 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472289 4788 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472299 4788 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472308 4788 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472318 4788 flags.go:64] FLAG: --cgroup-root=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472326 4788 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472335 4788 flags.go:64] FLAG: --client-ca-file=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472344 4788 flags.go:64] FLAG: --cloud-config=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472354 4788 flags.go:64] FLAG: --cloud-provider=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472363 4788 flags.go:64] FLAG: --cluster-dns="[]"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472375 4788 flags.go:64] FLAG: --cluster-domain=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472383 4788 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472392 4788 flags.go:64] FLAG: --config-dir=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472400 4788 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472410 4788 flags.go:64] FLAG: --container-log-max-files="5"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472433 4788 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472441 4788 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472450 4788 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472460 4788 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472468 4788 flags.go:64] FLAG: --contention-profiling="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472478 4788 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472487 4788 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472496 4788 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472504 4788 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472515 4788 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472524 4788 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472533 4788 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472541 4788 flags.go:64] FLAG: --enable-load-reader="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472550 4788 flags.go:64] FLAG: --enable-server="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472559 4788 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472570 4788 flags.go:64] FLAG: --event-burst="100"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472579 4788 flags.go:64] FLAG: --event-qps="50"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472588 4788 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472596 4788 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472607 4788 flags.go:64] FLAG: --eviction-hard=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472617 4788 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472627 4788 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472636 4788 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472645 4788 flags.go:64] FLAG: --eviction-soft=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472653 4788 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472662 4788 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472673 4788 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472684 4788 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472693 4788 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472702 4788 flags.go:64] FLAG: --fail-swap-on="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472710 4788 flags.go:64] FLAG: --feature-gates=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472721 4788 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472730 4788 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472739 4788 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472748 4788 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472757 4788 flags.go:64] FLAG: --healthz-port="10248"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472765 4788 flags.go:64] FLAG: --help="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472775 4788 flags.go:64] FLAG: --hostname-override=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472783 4788 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472792 4788 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472800 4788 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472809 4788 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472818 4788 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472827 4788 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472835 4788 flags.go:64] FLAG: --image-service-endpoint=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472843 4788 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472852 4788 flags.go:64] FLAG: --kube-api-burst="100"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472861 4788 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472872 4788 flags.go:64] FLAG: --kube-api-qps="50"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472880 4788 flags.go:64] FLAG: --kube-reserved=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472889 4788 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472897 4788 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472907 4788 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472915 4788 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472924 4788 flags.go:64] FLAG: --lock-file=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472933 4788 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472942 4788 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472951 4788 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472965 4788 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472974 4788 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472982 4788 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.472991 4788 flags.go:64] FLAG: --logging-format="text"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473000 4788 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473009 4788 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473018 4788 flags.go:64] FLAG: --manifest-url=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473027 4788 flags.go:64] FLAG: --manifest-url-header=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473038 4788 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473047 4788 flags.go:64] FLAG: --max-open-files="1000000"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473058 4788 flags.go:64] FLAG: --max-pods="110"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473067 4788 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473076 4788 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473084 4788 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473093 4788 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473102 4788 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473111 4788 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473120 4788 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473139 4788 flags.go:64] FLAG: --node-status-max-images="50"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473148 4788 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473158 4788 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473167 4788 flags.go:64] FLAG: --pod-cidr=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473176 4788 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473188 4788 flags.go:64] FLAG: --pod-manifest-path=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473197 4788 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473206 4788 flags.go:64] FLAG: --pods-per-core="0"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473214 4788 flags.go:64] FLAG: --port="10250"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473223 4788 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473232 4788 flags.go:64] FLAG: --provider-id=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473241 4788 flags.go:64] FLAG: --qos-reserved=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473273 4788 flags.go:64] FLAG: --read-only-port="10255"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473282 4788 flags.go:64] FLAG: --register-node="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473291 4788 flags.go:64] FLAG: --register-schedulable="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473301 4788 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473316 4788 flags.go:64] FLAG: --registry-burst="10"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473325 4788 flags.go:64] FLAG: --registry-qps="5"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473333 4788 flags.go:64] FLAG: --reserved-cpus=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473342 4788 flags.go:64] FLAG: --reserved-memory=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473352 4788 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473361 4788 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473370 4788 flags.go:64] FLAG: --rotate-certificates="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473379 4788 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473387 4788 flags.go:64] FLAG: --runonce="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473396 4788 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473405 4788 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473414 4788 flags.go:64] FLAG: --seccomp-default="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473423 4788 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473432 4788 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473441 4788 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473449 4788 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473458 4788 flags.go:64] FLAG: --storage-driver-password="root"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473467 4788 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473476 4788 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473485 4788 flags.go:64] FLAG: --storage-driver-user="root"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473494 4788 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473503 4788 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473513 4788 flags.go:64] FLAG: --system-cgroups=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473521 4788 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473535 4788 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473543 4788 flags.go:64] FLAG: --tls-cert-file=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473552 4788 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473563 4788 flags.go:64] FLAG: --tls-min-version=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473571 4788 flags.go:64] FLAG: --tls-private-key-file=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473580 4788 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473588 4788 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473597 4788 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473606 4788 flags.go:64] FLAG: --v="2"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473618 4788 flags.go:64] FLAG: --version="false"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473629 4788 flags.go:64] FLAG: --vmodule=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473640 4788 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.473649 4788 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473846 4788 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473857 4788 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473866 4788 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473874 4788 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473883 4788 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473890 4788 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473898 4788 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473906 4788 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473914 4788 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473921 4788 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473931 4788 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473943 4788 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473954 4788 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473963 4788 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473973 4788 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473983 4788 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.473991 4788 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474000 4788 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474007 4788 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474015 4788 feature_gate.go:330] unrecognized feature gate: Example
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474023 4788 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474030 4788 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474040 4788 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474050 4788 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474059 4788 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474067 4788 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474075 4788 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474083 4788 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474091 4788 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474098 4788 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474106 4788 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474113 4788 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474121 4788 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474129 4788 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474159 4788 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474167 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474175 4788 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474183 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474190 4788 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474198 4788 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474205 4788 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474213 4788 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474220 4788 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474235 4788 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474266 4788 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474274 4788 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474282 4788 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474289 4788 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474297 4788 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474304 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474312 4788 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474319 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474328 4788 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474336 4788 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474344 4788 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474351 4788 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474359 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474367 4788 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474374 4788 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474382 4788 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474389 4788 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474397 4788 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474405 4788 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474412 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474421 4788 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474429 4788 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474436 4788 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474444 4788 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474452 4788 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474459 4788 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.474468 4788 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.474491 4788 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.486563 4788 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.486632 4788 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486775 4788 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486798 4788 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486809 4788 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486819 4788 feature_gate.go:330] unrecognized feature gate: 
ChunkSizeMiB Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486829 4788 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486840 4788 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486849 4788 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486858 4788 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486866 4788 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486875 4788 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486884 4788 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486896 4788 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486906 4788 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486915 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486924 4788 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486933 4788 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486942 4788 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486950 4788 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486958 4788 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486967 4788 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486975 4788 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486984 4788 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.486993 4788 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487001 4788 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487009 4788 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487018 4788 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487027 4788 feature_gate.go:330] 
unrecognized feature gate: AdminNetworkPolicy Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487040 4788 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487055 4788 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487066 4788 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487077 4788 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487088 4788 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487098 4788 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487107 4788 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487117 4788 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487126 4788 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487135 4788 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487143 4788 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487152 4788 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487160 4788 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487168 4788 feature_gate.go:330] 
unrecognized feature gate: GCPClusterHostedDNS Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487177 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487185 4788 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487194 4788 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487203 4788 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487211 4788 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487219 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487229 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487237 4788 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487270 4788 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487279 4788 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487289 4788 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487297 4788 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487306 4788 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487314 4788 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 
08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487322 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487331 4788 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487342 4788 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487351 4788 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487359 4788 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487369 4788 feature_gate.go:330] unrecognized feature gate: Example Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487382 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487393 4788 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487404 4788 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487417 4788 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487428 4788 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487438 4788 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487447 4788 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487456 4788 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487464 4788 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487474 4788 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.487490 4788 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487739 4788 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487754 4788 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487766 4788 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487778 4788 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487790 4788 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487801 4788 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487810 4788 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487823 4788 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487831 4788 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487842 4788 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487851 4788 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487860 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487869 4788 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487877 4788 feature_gate.go:330] unrecognized feature gate: Example Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487885 4788 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487894 4788 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487902 4788 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 19 08:44:58 crc 
kubenswrapper[4788]: W0219 08:44:58.487910 4788 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487922 4788 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487933 4788 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487941 4788 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487950 4788 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487959 4788 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487968 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487976 4788 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487984 4788 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.487994 4788 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488002 4788 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488011 4788 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488019 4788 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488028 4788 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 19 08:44:58 crc 
kubenswrapper[4788]: W0219 08:44:58.488036 4788 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488045 4788 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488054 4788 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488087 4788 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488097 4788 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488108 4788 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488120 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488130 4788 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488140 4788 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488149 4788 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488158 4788 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488167 4788 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488176 4788 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488184 4788 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 19 08:44:58 crc kubenswrapper[4788]: 
W0219 08:44:58.488193 4788 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488202 4788 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488210 4788 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488219 4788 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488227 4788 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488236 4788 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488288 4788 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488299 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488307 4788 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488316 4788 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488324 4788 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488332 4788 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488341 4788 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488350 4788 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488359 4788 feature_gate.go:330] unrecognized feature gate: 
NodeDisruptionPolicy Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488367 4788 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488376 4788 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488384 4788 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488393 4788 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488401 4788 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488409 4788 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488418 4788 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488426 4788 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488435 4788 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488445 4788 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.488458 4788 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.488475 4788 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.488763 4788 server.go:940] "Client rotation is on, will bootstrap in background" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.495006 4788 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.495163 4788 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.497109 4788 server.go:997] "Starting client certificate rotation" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.497155 4788 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.497877 4788 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-11 10:26:10.915402377 +0000 UTC Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.498038 4788 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.531892 4788 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.534993 4788 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:44:58 crc 
kubenswrapper[4788]: I0219 08:44:58.535137 4788 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.556932 4788 log.go:25] "Validated CRI v1 runtime API" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.598116 4788 log.go:25] "Validated CRI v1 image API" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.600661 4788 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.606406 4788 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-19-08-40-33-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.606444 4788 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.624371 4788 manager.go:217] Machine: {Timestamp:2026-02-19 08:44:58.620570091 +0000 UTC m=+0.608581653 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:24e72cbc-0955-41b3-bfe8-a41d7b46c663 BootID:9f17f6ab-a8e6-46f8-93e5-a456adb8cae3 
Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:91:71:39 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:91:71:39 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:82:b0:20 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b5:97:14 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:1a:de:f4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:84:76:2c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:f6:22:63:4a:4e:a9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:62:6e:73:49:22:05 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 
Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 
Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.624787 4788 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.625037 4788 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.625420 4788 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.625621 4788 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.625661 4788 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.626780 4788 topology_manager.go:138] "Creating topology manager with none policy" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.626806 4788 container_manager_linux.go:303] "Creating device plugin manager" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.627301 4788 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.627331 4788 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.627528 4788 state_mem.go:36] "Initialized new in-memory state store" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.627645 4788 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.631010 4788 kubelet.go:418] "Attempting to sync node with API server" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.631042 4788 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.631065 4788 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.631084 4788 kubelet.go:324] "Adding apiserver pod source" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.631104 4788 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.635599 4788 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.636488 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.636671 4788 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.636670 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.636701 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.636831 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.638920 4788 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641042 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641063 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641070 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641078 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641088 4788 plugins.go:603] "Loaded volume 
plugin" pluginName="kubernetes.io/nfs" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641096 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641104 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641114 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641122 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641130 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641139 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641145 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641162 4788 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641533 4788 server.go:1280] "Started kubelet" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.641606 4788 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.642578 4788 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 19 08:44:58 crc systemd[1]: Started Kubernetes Kubelet. 
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.644115 4788 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.644626 4788 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.650394 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.652738 4788 server.go:460] "Adding debug handlers to kubelet server" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.652966 4788 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.653979 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 16:08:47.103073103 +0000 UTC Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.654039 4788 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.654066 4788 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.654174 4788 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.654174 4788 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.169:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18959967976f87ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 08:44:58.641508269 +0000 UTC m=+0.629519741,LastTimestamp:2026-02-19 08:44:58.641508269 +0000 UTC m=+0.629519741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.655965 4788 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.656327 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="200ms" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.656831 4788 factory.go:55] Registering systemd factory Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.656867 4788 factory.go:221] Registration of the systemd container factory successfully Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.657360 4788 factory.go:153] Registering CRI-O factory Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.657393 4788 factory.go:221] Registration of the crio container factory successfully Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.657476 4788 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.657442 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.657506 4788 factory.go:103] Registering Raw factory Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.657526 4788 manager.go:1196] Started watching for new ooms in manager Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.657528 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.658341 4788 manager.go:319] Starting recovery of all containers Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.673632 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.673869 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.673897 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.673954 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.673978 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.673999 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674057 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674080 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674143 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674169 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674223 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674294 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674323 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674385 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674418 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674474 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674498 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674565 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674591 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674613 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674670 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674694 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" 
seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674749 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674772 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.674796 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.675937 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676014 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676097 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: 
I0219 08:44:58.676135 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676190 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676215 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676285 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676320 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676380 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676409 4788 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676433 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676493 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676518 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676578 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676602 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676659 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676688 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676715 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676794 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676858 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.676899 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677103 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" 
seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677180 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677229 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677375 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677469 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677506 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677647 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 
08:44:58.677694 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677746 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677786 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677872 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677918 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677950 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.677985 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678021 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678044 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678092 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678115 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678148 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678186 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678213 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678308 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678332 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678366 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678395 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678418 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678445 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678468 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678492 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678516 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678553 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678576 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678600 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678634 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.678666 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.682911 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.683051 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.683112 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.683154 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.683178 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.683215 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.683313 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.684883 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.684949 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.684976 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.684998 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685017 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685038 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685062 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685091 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685119 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685143 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685163 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685185 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685204 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685226 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685274 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685295 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685344 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685403 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685429 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685453 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685477 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685499 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685520 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685541 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685561 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685584 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685604 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685624 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685645 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685665 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685684 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685706 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685726 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685746 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685768 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685795 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685816 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685836 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685856 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685875 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685896 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685915 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685962 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.685982 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686002 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686024 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686044 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686066 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686086 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686110 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686138 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686164 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686191 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686218 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686240 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686288 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686308 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686327 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686348 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686368 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686389 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686410 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686431 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686451 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686472 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686540 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686562 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686581 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686604 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686622 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686643 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686662 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686682 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686701 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686721 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686743 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686763 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686782 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686803 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686825 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686847 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686867 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686887 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686906 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686927 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686947 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686967 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.686987 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689406 4788 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state"
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689459 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689484 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689505 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689526 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689547 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689567 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689589 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689610 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689629 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689648 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689669 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689689 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689709 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689729 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689750 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689771 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689790 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689812 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689833 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689852 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689873 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689892 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689913 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689933 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" 
seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689953 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689974 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.689994 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.690013 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.690033 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.690054 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: 
I0219 08:44:58.690076 4788 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.690100 4788 reconstruct.go:97] "Volume reconstruction finished" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.690114 4788 reconciler.go:26] "Reconciler: start to sync state" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.692495 4788 manager.go:324] Recovery completed Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.704797 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.707036 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.707085 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.707118 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.708966 4788 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.708984 4788 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.709001 4788 state_mem.go:36] "Initialized new in-memory state store" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.709859 4788 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.712991 4788 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.713052 4788 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.713097 4788 kubelet.go:2335] "Starting kubelet main sync loop" Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.713164 4788 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 19 08:44:58 crc kubenswrapper[4788]: W0219 08:44:58.714172 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.714286 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.737636 4788 policy_none.go:49] "None policy: Start" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.738439 4788 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.738466 4788 state_mem.go:35] "Initializing new in-memory state store" Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.755300 4788 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.795995 4788 manager.go:334] "Starting Device Plugin manager" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.796117 4788 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.796133 4788 server.go:79] "Starting device plugin registration server" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.796713 4788 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.796732 4788 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.797557 4788 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.797638 4788 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.797648 4788 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.806956 4788 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.813296 4788 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.813385 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.815170 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.815204 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.815215 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.815376 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.816008 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.816041 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.816373 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.816395 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.816404 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.816478 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.816643 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.816666 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.817385 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.817412 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.817425 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.817656 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.817640 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.817685 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.817693 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.817700 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.817762 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.817978 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc 
kubenswrapper[4788]: I0219 08:44:58.818087 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.818144 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.820288 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.820307 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.820314 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.820410 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.820695 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.820766 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.820908 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.820953 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.820982 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.821182 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.821197 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.821205 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.821334 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.821337 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.821368 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.821390 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.821423 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.821850 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.821868 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.821875 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.856960 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="400ms" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.892864 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 
19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.892962 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.893139 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.893219 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.893408 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.893552 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.893634 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.893670 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.893742 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.893790 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.893839 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.893924 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.894041 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.894646 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.895308 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.896893 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.898719 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.898762 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.898791 4788 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.898817 4788 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 08:44:58 crc kubenswrapper[4788]: E0219 08:44:58.899639 4788 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997043 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997129 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997177 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997205 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997270 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997292 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997293 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997323 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997387 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997771 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997766 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997313 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997813 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997857 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997915 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 
08:44:58.997946 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997964 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997974 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.998028 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.997973 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.998090 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") 
pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.998158 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.998225 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.998342 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.998361 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.998409 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.998427 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.998449 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.998412 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 08:44:58 crc kubenswrapper[4788]: I0219 08:44:58.998611 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.100108 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.101980 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.102055 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.102076 4788 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.102115 4788 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 08:44:59 crc kubenswrapper[4788]: E0219 08:44:59.102781 4788 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.145602 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.153995 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.178449 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.198489 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 19 08:44:59 crc kubenswrapper[4788]: W0219 08:44:59.201002 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-639ba58571f08d654bcdc0a9618028f1ebb072f3b73571a65c59e8a811c96fcd WatchSource:0}: Error finding container 639ba58571f08d654bcdc0a9618028f1ebb072f3b73571a65c59e8a811c96fcd: Status 404 returned error can't find the container with id 639ba58571f08d654bcdc0a9618028f1ebb072f3b73571a65c59e8a811c96fcd Feb 19 08:44:59 crc kubenswrapper[4788]: W0219 08:44:59.201803 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-29fd6b5bd836e56c5572499dd10cd11a647e189e9fb78a5977c525c2761fef16 WatchSource:0}: Error finding container 29fd6b5bd836e56c5572499dd10cd11a647e189e9fb78a5977c525c2761fef16: Status 404 returned error can't find the container with id 29fd6b5bd836e56c5572499dd10cd11a647e189e9fb78a5977c525c2761fef16 Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.206678 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 19 08:44:59 crc kubenswrapper[4788]: W0219 08:44:59.208939 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-525dfd46b5d4d721f6c58dcfc80bc71f53f4681c2154650f18efe55ae3992e74 WatchSource:0}: Error finding container 525dfd46b5d4d721f6c58dcfc80bc71f53f4681c2154650f18efe55ae3992e74: Status 404 returned error can't find the container with id 525dfd46b5d4d721f6c58dcfc80bc71f53f4681c2154650f18efe55ae3992e74 Feb 19 08:44:59 crc kubenswrapper[4788]: W0219 08:44:59.220606 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-086afd376c3e8d7b05ba77a0613ae7e033c741c11b7bde62259ef550aaf361fd WatchSource:0}: Error finding container 086afd376c3e8d7b05ba77a0613ae7e033c741c11b7bde62259ef550aaf361fd: Status 404 returned error can't find the container with id 086afd376c3e8d7b05ba77a0613ae7e033c741c11b7bde62259ef550aaf361fd Feb 19 08:44:59 crc kubenswrapper[4788]: W0219 08:44:59.243440 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f9c06da7963c8d7b5770c1b0136b5e519078964a41b088923bf0e8e793edb69f WatchSource:0}: Error finding container f9c06da7963c8d7b5770c1b0136b5e519078964a41b088923bf0e8e793edb69f: Status 404 returned error can't find the container with id f9c06da7963c8d7b5770c1b0136b5e519078964a41b088923bf0e8e793edb69f Feb 19 08:44:59 crc kubenswrapper[4788]: E0219 08:44:59.257749 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="800ms" Feb 19 
08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.503041 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.504824 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.504856 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.504867 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.504889 4788 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 08:44:59 crc kubenswrapper[4788]: E0219 08:44:59.505380 4788 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Feb 19 08:44:59 crc kubenswrapper[4788]: W0219 08:44:59.520621 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:44:59 crc kubenswrapper[4788]: E0219 08:44:59.520704 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:44:59 crc kubenswrapper[4788]: W0219 08:44:59.623585 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:44:59 crc kubenswrapper[4788]: E0219 08:44:59.623691 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.645003 4788 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:44:59 crc kubenswrapper[4788]: W0219 08:44:59.650541 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:44:59 crc kubenswrapper[4788]: E0219 08:44:59.650626 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.655035 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 20:57:23.436417231 +0000 UTC Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.717065 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f9c06da7963c8d7b5770c1b0136b5e519078964a41b088923bf0e8e793edb69f"} Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.717839 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"086afd376c3e8d7b05ba77a0613ae7e033c741c11b7bde62259ef550aaf361fd"} Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.718637 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"525dfd46b5d4d721f6c58dcfc80bc71f53f4681c2154650f18efe55ae3992e74"} Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.719554 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"639ba58571f08d654bcdc0a9618028f1ebb072f3b73571a65c59e8a811c96fcd"} Feb 19 08:44:59 crc kubenswrapper[4788]: I0219 08:44:59.720587 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"29fd6b5bd836e56c5572499dd10cd11a647e189e9fb78a5977c525c2761fef16"} Feb 19 08:44:59 crc kubenswrapper[4788]: W0219 08:44:59.838165 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:44:59 crc kubenswrapper[4788]: E0219 08:44:59.838239 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list 
*v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:45:00 crc kubenswrapper[4788]: E0219 08:45:00.058351 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="1.6s" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.306452 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.308101 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.308156 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.308174 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.308212 4788 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 08:45:00 crc kubenswrapper[4788]: E0219 08:45:00.308794 4788 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.554645 4788 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 08:45:00 crc kubenswrapper[4788]: E0219 08:45:00.555775 4788 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed 
certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.645869 4788 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.656037 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 08:07:18.151553518 +0000 UTC Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.724641 4788 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742" exitCode=0 Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.724715 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742"} Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.724802 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.725838 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.725877 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.725893 4788 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.727750 4788 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="eb73dcf48a8273c866c1ccb092879cd5acf4b1f98ac72af914f6fdbe9140163f" exitCode=0 Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.727882 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"eb73dcf48a8273c866c1ccb092879cd5acf4b1f98ac72af914f6fdbe9140163f"} Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.728013 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.730978 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.731014 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.731025 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.731887 4788 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec" exitCode=0 Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.731953 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec"} Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.732023 4788 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.733008 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.733051 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.733068 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.735139 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.735317 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0"} Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.735339 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753"} Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.735352 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f"} Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.735362 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57"} Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.736352 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.736404 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.736422 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.737303 4788 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69" exitCode=0 Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.737346 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69"} Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.737416 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.738694 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.738733 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.738756 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.740496 4788 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.744169 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.744210 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:00 crc kubenswrapper[4788]: I0219 08:45:00.744229 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:01 crc kubenswrapper[4788]: E0219 08:45:01.208120 4788 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.169:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.18959967976f87ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 08:44:58.641508269 +0000 UTC m=+0.629519741,LastTimestamp:2026-02-19 08:44:58.641508269 +0000 UTC m=+0.629519741,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.645784 4788 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.657271 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 
00:01:24.35823262 +0000 UTC Feb 19 08:45:01 crc kubenswrapper[4788]: E0219 08:45:01.659355 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="3.2s" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.745807 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23"} Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.745880 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3"} Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.745897 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f"} Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.745911 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca"} Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.745928 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171"} Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 
08:45:01.748473 4788 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248" exitCode=0 Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.748532 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248"} Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.748595 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.749534 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.749566 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.749576 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.751665 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"651cb8236ffe56a187d9a013202e50d1ead62f85058c3e039ab897f2389b813d"} Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.751759 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.753886 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.753942 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.753957 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.760451 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"55e5428237ae5741c34aae6754486baedada4ba34c8d5134ea392c6d54698836"} Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.760519 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"eefb227dd2add39a5e4c83674ebe19e375697257f8bc8c29a15e951e49650be4"} Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.760535 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"68431cd65fb593de62afcc19c22d4e3f8d8da669e9a1fa22c820abbbf27585a0"} Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.760550 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.761316 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.761580 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.761624 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.761637 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.763047 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.763075 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.763083 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.876653 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.909824 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.911282 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.911332 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.911341 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:01 crc kubenswrapper[4788]: I0219 08:45:01.911376 4788 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 08:45:01 crc kubenswrapper[4788]: E0219 08:45:01.911776 4788 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.169:6443: connect: connection refused" node="crc" Feb 19 08:45:02 crc kubenswrapper[4788]: W0219 08:45:02.021629 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:45:02 crc kubenswrapper[4788]: E0219 08:45:02.021757 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:45:02 crc kubenswrapper[4788]: W0219 08:45:02.157167 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:45:02 crc kubenswrapper[4788]: E0219 08:45:02.157316 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: connection refused" logger="UnhandledError" Feb 19 08:45:02 crc kubenswrapper[4788]: W0219 08:45:02.194572 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.169:6443: connect: connection refused Feb 19 08:45:02 crc kubenswrapper[4788]: E0219 08:45:02.194676 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.169:6443: connect: 
connection refused" logger="UnhandledError" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.657888 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 06:05:10.800157479 +0000 UTC Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.767617 4788 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526" exitCode=0 Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.767862 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.767898 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526"} Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.767951 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.767862 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.767863 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.768037 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.767966 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770374 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770400 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770377 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770444 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770417 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770473 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770480 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770524 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770567 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770585 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770459 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770690 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.770392 4788 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.771332 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:02 crc kubenswrapper[4788]: I0219 08:45:02.771453 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.658747 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 20:08:53.573801563 +0000 UTC Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.775655 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e"} Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.775736 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7"} Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.775754 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.775815 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.775698 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.775762 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6"} Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.775999 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00"} Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.776934 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.776947 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.776969 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.776982 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.776983 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:03 crc kubenswrapper[4788]: I0219 08:45:03.777006 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.289826 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.296670 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.296900 4788 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.298202 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.298303 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.298326 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.659239 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 15:14:20.027328645 +0000 UTC Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.783448 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5"} Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.783495 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.783531 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.783576 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.785159 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.785195 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.785205 4788 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.785348 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.785400 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.785425 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:04 crc kubenswrapper[4788]: I0219 08:45:04.929135 4788 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.111958 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.113674 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.113721 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.113730 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.113754 4788 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.659429 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 12:37:02.624026401 +0000 UTC Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.708701 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.708877 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.710506 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.710562 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.710585 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.786679 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.788113 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.788204 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.788223 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.936459 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.936782 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.936871 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.939687 4788 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.939737 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:05 crc kubenswrapper[4788]: I0219 08:45:05.939752 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:06 crc kubenswrapper[4788]: I0219 08:45:06.659780 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 21:53:07.589969888 +0000 UTC Feb 19 08:45:07 crc kubenswrapper[4788]: I0219 08:45:07.297385 4788 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 08:45:07 crc kubenswrapper[4788]: I0219 08:45:07.297452 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 08:45:07 crc kubenswrapper[4788]: I0219 08:45:07.359324 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 19 08:45:07 crc kubenswrapper[4788]: I0219 08:45:07.359465 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:07 crc kubenswrapper[4788]: I0219 08:45:07.360627 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 08:45:07 crc kubenswrapper[4788]: I0219 08:45:07.360855 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:07 crc kubenswrapper[4788]: I0219 08:45:07.360873 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:07 crc kubenswrapper[4788]: I0219 08:45:07.660281 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 21:40:28.648445703 +0000 UTC Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.277867 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.278025 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.279351 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.279409 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.279425 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.283609 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.661353 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 20:37:58.887950141 +0000 UTC Feb 19 08:45:08 crc kubenswrapper[4788]: 
I0219 08:45:08.794459 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.795653 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.795699 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.795716 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:08 crc kubenswrapper[4788]: E0219 08:45:08.807112 4788 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.853323 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.853573 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.854682 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.854710 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.854722 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.873655 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.873896 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.875349 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.875409 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:08 crc kubenswrapper[4788]: I0219 08:45:08.875438 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:09 crc kubenswrapper[4788]: I0219 08:45:09.662156 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 01:56:47.043393095 +0000 UTC
Feb 19 08:45:10 crc kubenswrapper[4788]: I0219 08:45:10.662702 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 12:32:30.439544459 +0000 UTC
Feb 19 08:45:11 crc kubenswrapper[4788]: I0219 08:45:11.663348 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 20:05:26.512265564 +0000 UTC
Feb 19 08:45:12 crc kubenswrapper[4788]: W0219 08:45:12.318803 4788 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 19 08:45:12 crc kubenswrapper[4788]: I0219 08:45:12.319127 4788 trace.go:236] Trace[1654086055]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 08:45:02.317) (total time: 10001ms):
Feb 19 08:45:12 crc kubenswrapper[4788]: Trace[1654086055]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (08:45:12.318)
Feb 19 08:45:12 crc kubenswrapper[4788]: Trace[1654086055]: [10.001623836s] [10.001623836s] END
Feb 19 08:45:12 crc kubenswrapper[4788]: E0219 08:45:12.319271 4788 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 19 08:45:12 crc kubenswrapper[4788]: I0219 08:45:12.646009 4788 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Feb 19 08:45:12 crc kubenswrapper[4788]: I0219 08:45:12.663647 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 22:01:46.266215268 +0000 UTC
Feb 19 08:45:12 crc kubenswrapper[4788]: I0219 08:45:12.807789 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 19 08:45:12 crc kubenswrapper[4788]: I0219 08:45:12.810412 4788 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23" exitCode=255
Feb 19 08:45:12 crc kubenswrapper[4788]: I0219 08:45:12.810463 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23"}
Feb 19 08:45:12 crc kubenswrapper[4788]: I0219 08:45:12.810590 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 08:45:12 crc kubenswrapper[4788]: I0219 08:45:12.811454 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:12 crc kubenswrapper[4788]: I0219 08:45:12.811509 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:12 crc kubenswrapper[4788]: I0219 08:45:12.811526 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:12 crc kubenswrapper[4788]: I0219 08:45:12.812316 4788 scope.go:117] "RemoveContainer" containerID="d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23"
Feb 19 08:45:13 crc kubenswrapper[4788]: I0219 08:45:13.073790 4788 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 19 08:45:13 crc kubenswrapper[4788]: I0219 08:45:13.073853 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 19 08:45:13 crc kubenswrapper[4788]: I0219 08:45:13.086558 4788 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Feb 19 08:45:13 crc kubenswrapper[4788]: I0219 08:45:13.086620 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 19 08:45:13 crc kubenswrapper[4788]: I0219 08:45:13.664786 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 08:36:40.594313133 +0000 UTC
Feb 19 08:45:13 crc kubenswrapper[4788]: I0219 08:45:13.817394 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 19 08:45:13 crc kubenswrapper[4788]: I0219 08:45:13.820296 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73"}
Feb 19 08:45:13 crc kubenswrapper[4788]: I0219 08:45:13.820521 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 08:45:13 crc kubenswrapper[4788]: I0219 08:45:13.821788 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:13 crc kubenswrapper[4788]: I0219 08:45:13.821839 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:13 crc kubenswrapper[4788]: I0219 08:45:13.821857 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:14 crc kubenswrapper[4788]: I0219 08:45:14.298067 4788 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]log ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]etcd ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/openshift.io-api-request-count-filter ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/openshift.io-startkubeinformers ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/priority-and-fairness-config-consumer ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/priority-and-fairness-filter ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/start-apiextensions-informers ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/start-apiextensions-controllers ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/crd-informer-synced ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/start-system-namespaces-controller ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/start-cluster-authentication-info-controller ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/start-legacy-token-tracking-controller ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/start-service-ip-repair-controllers ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/priority-and-fairness-config-producer ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/bootstrap-controller ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/start-kube-aggregator-informers ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/apiservice-status-local-available-controller ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/apiservice-status-remote-available-controller ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/apiservice-registration-controller ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/apiservice-wait-for-first-sync ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/apiservice-discovery-controller ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/kube-apiserver-autoregistration ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]autoregister-completion ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/apiservice-openapi-controller ok
Feb 19 08:45:14 crc kubenswrapper[4788]: [+]poststarthook/apiservice-openapiv3-controller ok
Feb 19 08:45:14 crc kubenswrapper[4788]: livez check failed
Feb 19 08:45:14 crc kubenswrapper[4788]: I0219 08:45:14.298141 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 08:45:14 crc kubenswrapper[4788]: I0219 08:45:14.665339 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 21:17:02.189273223 +0000 UTC
Feb 19 08:45:15 crc kubenswrapper[4788]: I0219 08:45:15.665619 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 22:52:07.412982388 +0000 UTC
Feb 19 08:45:15 crc kubenswrapper[4788]: I0219 08:45:15.715785 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 08:45:15 crc kubenswrapper[4788]: I0219 08:45:15.715990 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 19 08:45:15 crc kubenswrapper[4788]: I0219 08:45:15.717606 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:15 crc kubenswrapper[4788]: I0219 08:45:15.717662 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:15 crc kubenswrapper[4788]: I0219 08:45:15.717687 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:16 crc kubenswrapper[4788]: I0219 08:45:16.666753 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 00:36:00.402159378 +0000 UTC
Feb 19 08:45:17 crc kubenswrapper[4788]: I0219 08:45:17.297398 4788 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 19 08:45:17 crc kubenswrapper[4788]: I0219 08:45:17.297476 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 19 08:45:17 crc kubenswrapper[4788]: I0219 08:45:17.666875 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 18:02:09.418591347 +0000 UTC
Feb 19 08:45:17 crc kubenswrapper[4788]: I0219 08:45:17.879970 4788 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.063648 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.066897 4788 trace.go:236] Trace[490084090]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 08:45:07.329) (total time: 10737ms):
Feb 19 08:45:18 crc kubenswrapper[4788]: Trace[490084090]: ---"Objects listed" error: 10737ms (08:45:18.066)
Feb 19 08:45:18 crc kubenswrapper[4788]: Trace[490084090]: [10.737240734s] [10.737240734s] END
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.066947 4788 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.070353 4788 trace.go:236] Trace[1405550058]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 08:45:07.897) (total time: 10172ms):
Feb 19 08:45:18 crc kubenswrapper[4788]: Trace[1405550058]: ---"Objects listed" error: 10172ms (08:45:18.070)
Feb 19 08:45:18 crc kubenswrapper[4788]: Trace[1405550058]: [10.172356011s] [10.172356011s] END
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.070381 4788 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.072351 4788 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.072546 4788 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.072596 4788 trace.go:236] Trace[829794261]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (19-Feb-2026 08:45:07.351) (total time: 10721ms):
Feb 19 08:45:18 crc kubenswrapper[4788]: Trace[829794261]: ---"Objects listed" error: 10721ms (08:45:18.072)
Feb 19 08:45:18 crc kubenswrapper[4788]: Trace[829794261]: [10.721292058s] [10.721292058s] END
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.072632 4788 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.079092 4788 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.112699 4788 csr.go:261] certificate signing request csr-b8pft is approved, waiting to be issued
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.122205 4788 csr.go:257] certificate signing request csr-b8pft is issued
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.497359 4788 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.497566 4788 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": read tcp 38.102.83.169:38354->38.102.83.169:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18959967ba330a52 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 08:44:59.224746578 +0000 UTC m=+1.212758060,LastTimestamp:2026-02-19 08:44:59.224746578 +0000 UTC m=+1.212758060,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 19 08:45:18 crc kubenswrapper[4788]: W0219 08:45:18.497658 4788 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Feb 19 08:45:18 crc kubenswrapper[4788]: W0219 08:45:18.497677 4788 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Feb 19 08:45:18 crc kubenswrapper[4788]: W0219 08:45:18.497688 4788 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Feb 19 08:45:18 crc kubenswrapper[4788]: W0219 08:45:18.497712 4788 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.645758 4788 apiserver.go:52] "Watching apiserver"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.661893 4788 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.662182 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-rfl2j","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"]
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.662620 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.662980 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.663013 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rfl2j"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.662731 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.663033 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.662664 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.662821 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.663386 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.663484 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.663444 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.665104 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.665193 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.665576 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.668485 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 21:21:22.622682508 +0000 UTC
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.668547 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.668660 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.668672 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.668735 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.668795 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.668810 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.668818 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.668918 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.669244 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.686648 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.698859 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.708359 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.718775 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.732198 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.743134 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.754079 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.757110 4788 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.760067 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.767442 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.775339 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776601 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776647 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776671 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776697 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776717 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776738 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776761 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 08:45:18 crc 
kubenswrapper[4788]: I0219 08:45:18.776785 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776833 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776859 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776880 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776901 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776924 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776945 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776970 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776991 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777012 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777037 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777058 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777080 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777099 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777120 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777143 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777164 4788 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777186 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777207 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777226 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777269 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777291 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" 
(UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777310 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777330 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777349 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777370 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777393 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777422 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777445 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777465 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777484 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777504 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777522 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777545 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777566 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777588 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777608 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777627 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777648 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777668 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777687 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777709 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777733 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777757 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777778 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777800 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777823 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777844 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777865 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777890 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777913 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777933 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777954 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777974 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777996 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778016 
4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778036 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778056 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778081 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778101 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778161 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778205 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778227 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778266 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778287 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778308 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778331 4788 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778355 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778378 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778400 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778421 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778443 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778464 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778483 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778501 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778525 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778549 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778566 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778581 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778600 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778615 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778633 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778649 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 19 08:45:18 crc 
kubenswrapper[4788]: I0219 08:45:18.778668 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778684 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778700 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778714 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778730 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778746 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778762 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778780 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778797 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778813 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778827 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 
19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778842 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778858 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778873 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778888 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778905 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778927 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778949 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778965 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778982 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778997 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779012 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779029 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779052 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779069 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779085 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779101 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779121 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779160 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779181 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779199 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779216 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779268 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 
08:45:18.779287 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779306 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779324 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779343 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779360 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779378 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") 
pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779398 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779413 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779429 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779445 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779461 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779476 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779491 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779511 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779528 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779544 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779561 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779576 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779594 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779609 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779626 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779649 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779675 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779698 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779714 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779731 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779749 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779767 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779782 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779798 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779814 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.776942 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779846 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779929 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779949 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779949 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779965 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779997 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780002 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777686 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777685 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777710 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777815 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777928 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780050 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780087 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780118 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780148 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780176 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780203 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780228 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780274 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780305 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780333 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780357 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780381 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780403 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780429 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780455 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780483 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780507 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780531 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780555 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780581 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780608 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780631 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780656 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780679 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780731 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780760 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780783 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780807 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780830 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780853 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780875 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780902 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780926 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780949 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780974 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780998 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781022 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781045 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781096 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781130 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6-hosts-file\") pod \"node-resolver-rfl2j\" (UID: \"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\") " pod="openshift-dns/node-resolver-rfl2j"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781160 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781188 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781216 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781247 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781289 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkp2d\" (UniqueName: \"kubernetes.io/projected/a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6-kube-api-access-fkp2d\") pod \"node-resolver-rfl2j\" (UID: \"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\") " pod="openshift-dns/node-resolver-rfl2j"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781321 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781345 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781374 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781403 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781426 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781449 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781474 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781496 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781523 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781588 4788 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781605 4788 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781620 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781636 4788 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781650 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781663 4788 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781677 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781691 4788 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781705 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.783516 4788 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.784125 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.788796 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.791914 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.794133 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.795153 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.800286 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777954 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778085 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778297 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778355 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778411 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778507 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778610 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778636 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778754 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.778867 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779004 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779156 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779158 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779323 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779459 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779491 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779536 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779579 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779704 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779713 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.779826 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780138 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780297 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780284 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780429 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780536 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780593 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780614 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780721 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.780930 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781077 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781083 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781151 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781270 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.802958 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781481 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781670 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.777377 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.781967 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.782073 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.782161 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.782172 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.782202 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.782654 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.782965 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.783012 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.783069 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.783142 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.782630 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.783329 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.783335 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.783473 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.783686 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.783810 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.783811 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.783963 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.784138 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.784239 4788 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.784281 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.784568 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.784713 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.784899 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.785160 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.785188 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.785317 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.785446 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.785465 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.785479 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.785609 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.785622 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.785683 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.785988 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.786289 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.786333 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.786368 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.786409 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.786583 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.786662 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.786813 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.786847 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.786868 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.786896 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.786905 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.787288 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.787293 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.787497 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.787729 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.787817 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.788089 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.788186 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.788492 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.788531 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.788547 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.788941 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.788965 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.789208 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.790419 4788 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.793018 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.793051 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.793102 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.793356 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.793412 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.793566 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.793589 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.795577 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.798497 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.798816 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.799229 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.799908 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.802423 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.802608 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.803272 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.803458 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.803507 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.803605 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.803757 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.803834 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.804597 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.805245 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.805396 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.805420 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.805468 4788 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.805581 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.805803 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.805877 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.805894 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.805954 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.806004 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:19.305960774 +0000 UTC m=+21.293972356 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.806046 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.806445 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.806656 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.806828 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:19.306807573 +0000 UTC m=+21.294819155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.806923 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.806950 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.807034 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:19.307013668 +0000 UTC m=+21.295025260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.807310 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.807406 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.807437 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.807534 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.807789 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.807914 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.807933 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.807946 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.808116 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.808549 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.808834 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.808950 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.809006 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.809103 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.809145 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:45:19.309125777 +0000 UTC m=+21.297137359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.809622 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.809698 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.809845 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.810135 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.810546 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.812089 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.812432 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.812679 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.812771 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.813001 4788 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 08:45:18 crc kubenswrapper[4788]: E0219 08:45:18.814192 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:19.314168172 +0000 UTC m=+21.302179754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.814318 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.815757 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.816168 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.816772 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.817634 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.819659 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.820095 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.820326 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.820319 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.821124 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.821157 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.821311 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.821909 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.821939 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.822435 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.824577 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.824606 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.824686 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.824925 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.825042 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.825466 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.825539 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.825563 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.825502 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.826793 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.827170 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.827386 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.828007 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.828229 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.828304 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.828353 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.828668 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.828811 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.830185 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.830408 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.831119 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.835523 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.845932 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.854476 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.874491 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884623 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884669 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6-hosts-file\") pod \"node-resolver-rfl2j\" (UID: \"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\") " pod="openshift-dns/node-resolver-rfl2j" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884688 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkp2d\" (UniqueName: 
\"kubernetes.io/projected/a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6-kube-api-access-fkp2d\") pod \"node-resolver-rfl2j\" (UID: \"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\") " pod="openshift-dns/node-resolver-rfl2j" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884718 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884758 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884768 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884776 4788 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884786 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884794 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: 
I0219 08:45:18.884802 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884809 4788 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884817 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884825 4788 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884904 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884912 4788 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884920 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884928 4788 reconciler_common.go:293] "Volume detached for volume 
\"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884936 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884943 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884950 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884958 4788 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884966 4788 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884973 4788 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884981 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" 
DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884989 4788 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.884997 4788 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885005 4788 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885013 4788 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885021 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885029 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885038 4788 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885046 4788 reconciler_common.go:293] "Volume 
detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885055 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885063 4788 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885070 4788 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885078 4788 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885086 4788 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885093 4788 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885100 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885108 4788 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885116 4788 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885125 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885133 4788 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885141 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885150 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885157 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885165 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885173 4788 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885181 4788 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885188 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885196 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885203 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885212 4788 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885220 4788 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885228 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885235 4788 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885296 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885304 4788 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885314 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.885396 4788 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886065 4788 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886078 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886086 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886094 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886102 4788 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886110 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886119 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886126 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886134 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886142 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886150 4788 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886157 4788 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886164 4788 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886173 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc 
kubenswrapper[4788]: I0219 08:45:18.886182 4788 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886189 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886197 4788 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886204 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886203 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6-hosts-file\") pod \"node-resolver-rfl2j\" (UID: \"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\") " pod="openshift-dns/node-resolver-rfl2j" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886221 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886282 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886303 4788 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886315 4788 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886327 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886337 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886346 4788 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886354 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886363 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886371 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886391 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886399 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886407 4788 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886415 4788 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886424 4788 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886432 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc 
kubenswrapper[4788]: I0219 08:45:18.886441 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886448 4788 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886457 4788 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886465 4788 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886473 4788 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886482 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886491 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886500 4788 reconciler_common.go:293] "Volume detached for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886508 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886515 4788 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886525 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886535 4788 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886545 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886554 4788 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886563 4788 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886572 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886581 4788 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886589 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886597 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886605 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886612 4788 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886620 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc 
kubenswrapper[4788]: I0219 08:45:18.886628 4788 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886636 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886644 4788 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886651 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886659 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886668 4788 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886676 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 
08:45:18.886684 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886692 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886700 4788 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886708 4788 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886716 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886735 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886743 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886751 4788 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886759 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886767 4788 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886774 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886783 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886790 4788 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886798 4788 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886807 4788 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 
crc kubenswrapper[4788]: I0219 08:45:18.886815 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886823 4788 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886832 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886839 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886857 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886865 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886872 4788 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886880 4788 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886888 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886900 4788 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886909 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886917 4788 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886925 4788 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886932 4788 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886940 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886950 4788 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886958 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886966 4788 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886975 4788 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886983 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886992 4788 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886999 4788 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887007 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887017 4788 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887025 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887032 4788 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887040 4788 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887048 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887056 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc 
kubenswrapper[4788]: I0219 08:45:18.887063 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887072 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887080 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887088 4788 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887096 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887104 4788 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887112 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887119 4788 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887128 4788 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887136 4788 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887144 4788 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887153 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887161 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887168 4788 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887177 4788 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887184 4788 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887192 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887200 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.887207 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.886551 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.899876 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.902534 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.904772 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.908785 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkp2d\" (UniqueName: \"kubernetes.io/projected/a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6-kube-api-access-fkp2d\") pod \"node-resolver-rfl2j\" (UID: \"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\") " pod="openshift-dns/node-resolver-rfl2j" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.942351 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.951050 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.955556 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.958736 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.973536 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: 
"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.976689 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.982225 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.988027 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.988123 4788 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.988145 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.988158 4788 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:18 crc kubenswrapper[4788]: I0219 08:45:18.993184 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 19 08:45:18 crc kubenswrapper[4788]: W0219 08:45:18.993504 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-878cb6347dfbd8dfabc8c57b7eb84a4934a766fb0c9bcc706f8d5451cb609a32 WatchSource:0}: Error finding container 878cb6347dfbd8dfabc8c57b7eb84a4934a766fb0c9bcc706f8d5451cb609a32: Status 404 returned error can't find the container with id 878cb6347dfbd8dfabc8c57b7eb84a4934a766fb0c9bcc706f8d5451cb609a32 Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.003635 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.005165 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.007814 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.012765 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rfl2j" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.015292 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.037998 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: W0219 08:45:19.046180 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-8f997695f1a248e6fcce5f0ccd679b6ff5b9061c67a7ef927db4e00aadc41263 WatchSource:0}: Error finding container 8f997695f1a248e6fcce5f0ccd679b6ff5b9061c67a7ef927db4e00aadc41263: Status 404 returned error can't find the container with id 8f997695f1a248e6fcce5f0ccd679b6ff5b9061c67a7ef927db4e00aadc41263 Feb 19 08:45:19 crc kubenswrapper[4788]: W0219 08:45:19.046775 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3d9c112_0fa4_4ad8_84f9_0eb7dd4d92f6.slice/crio-314482618f71fdad57f491fcb416e672e9f3887726d3381ee8aef58301a7b7b8 WatchSource:0}: Error finding container 314482618f71fdad57f491fcb416e672e9f3887726d3381ee8aef58301a7b7b8: Status 404 returned error can't find the container with id 314482618f71fdad57f491fcb416e672e9f3887726d3381ee8aef58301a7b7b8 Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.049944 4788 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.068362 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.069441 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.081486 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.089424 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.090210 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.097193 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.114833 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.123968 4788 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-19 08:40:18 +0000 UTC, rotation deadline is 2026-11-29 20:51:31.669614237 +0000 UTC Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.124020 4788 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6804h6m12.545596344s for next certificate rotation Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.125226 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.141358 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.149420 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.157537 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.293972 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.302682 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.313110 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.321322 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.333100 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.355708 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.372986 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.383192 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.391367 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.391441 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.391487 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.391503 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391583 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:45:20.391555752 +0000 UTC m=+22.379567224 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391599 4788 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391653 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:20.391638644 +0000 UTC m=+22.379650116 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.391692 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391706 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391721 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391732 4788 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391765 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-19 08:45:20.391757887 +0000 UTC m=+22.379769359 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391795 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391813 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391826 4788 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391851 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:20.391844669 +0000 UTC m=+22.379856141 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391895 4788 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.391922 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:20.39191379 +0000 UTC m=+22.379925382 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.397981 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.411720 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.668722 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 07:41:40.880593645 +0000 UTC Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.843042 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3"} Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.843280 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447"} Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.843291 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8f997695f1a248e6fcce5f0ccd679b6ff5b9061c67a7ef927db4e00aadc41263"} Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.844937 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090"} Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.844997 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"878cb6347dfbd8dfabc8c57b7eb84a4934a766fb0c9bcc706f8d5451cb609a32"} Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.846790 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rfl2j" event={"ID":"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6","Type":"ContainerStarted","Data":"3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b"} Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.846851 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rfl2j" 
event={"ID":"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6","Type":"ContainerStarted","Data":"314482618f71fdad57f491fcb416e672e9f3887726d3381ee8aef58301a7b7b8"} Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.848014 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ff6fc7a2e55e571373b299c616fff5198b0358fd7c97840a74671cc1cba0daf3"} Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.852764 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:45:19 crc kubenswrapper[4788]: E0219 08:45:19.858790 4788 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.871918 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.885408 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.900496 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.911995 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.929620 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.944335 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.959936 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.973554 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:19 crc kubenswrapper[4788]: I0219 08:45:19.991047 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.010483 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, 
/tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.023564 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.037078 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.050333 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.063098 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.085020 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.096926 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.108936 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.121097 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.400302 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.400387 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400401 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:45:22.400381627 +0000 UTC m=+24.388393099 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.400419 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.400442 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.400468 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400534 4788 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:20 crc 
kubenswrapper[4788]: E0219 08:45:20.400563 4788 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400571 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400659 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400671 4788 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400574 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400703 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400710 4788 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400585 
4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:22.400575102 +0000 UTC m=+24.388586574 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400751 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:22.400734335 +0000 UTC m=+24.388745887 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400765 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:22.400758606 +0000 UTC m=+24.388770188 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.400777 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:22.400771146 +0000 UTC m=+24.388782718 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.668862 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 03:52:10.696283281 +0000 UTC Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.713649 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.713695 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.713725 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.713768 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.713819 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:20 crc kubenswrapper[4788]: E0219 08:45:20.713869 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.718674 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.719392 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.720910 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.721690 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.722866 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.723567 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.724311 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.725181 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.725917 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.727095 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.727714 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.729073 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.729757 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.730440 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.731757 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.732409 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.733667 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.734152 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.734862 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.736015 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.736666 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.737742 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.738156 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.739295 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.739735 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.740416 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.741558 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.741995 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.742905 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.743387 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.744185 4788 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.744310 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.745866 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.746696 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.747081 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.748632 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.749219 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.750179 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.750846 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.752033 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.752504 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.753552 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.754229 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.755405 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.755934 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.756860 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.757358 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.758453 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.758929 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.759931 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.760386 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.761222 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.761767 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.762204 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.995936 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-tftzx"]
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.996196 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-9hxf6"]
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.996394 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tftzx"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.996405 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9hxf6"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.998570 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.998631 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.999236 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.999314 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 19 08:45:20 crc kubenswrapper[4788]: I0219 08:45:20.999544 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.000121 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-7s4rp"]
Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.000144 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.000259 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.000304 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.000332 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.000694 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7s4rp"
Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.002461 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.003116 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.003416 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.021768 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.040479 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.052702 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.060425 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.072449 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.082238 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.091217 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.100532 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105576 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-system-cni-dir\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105621 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-cni-dir\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105645 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj598\" (UniqueName: \"kubernetes.io/projected/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-kube-api-access-jj598\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105670 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105714 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2c07881f-4511-4cd1-9283-6891826b57a1-rootfs\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105736 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-os-release\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105756 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5c26787-29de-439a-86b8-920cac6c8ab8-cni-binary-copy\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" 
Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105778 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105801 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-run-netns\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105832 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-system-cni-dir\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105852 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-cnibin\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105873 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-conf-dir\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " 
pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105893 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lmj7\" (UniqueName: \"kubernetes.io/projected/a5c26787-29de-439a-86b8-920cac6c8ab8-kube-api-access-9lmj7\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105923 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-os-release\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105949 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-var-lib-cni-multus\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.105972 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-daemon-config\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106041 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c07881f-4511-4cd1-9283-6891826b57a1-proxy-tls\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " 
pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106085 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-socket-dir-parent\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106107 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-hostroot\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106134 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-run-multus-certs\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106150 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2c07881f-4511-4cd1-9283-6891826b57a1-mcd-auth-proxy-config\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106170 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106187 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-var-lib-kubelet\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106266 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk8cn\" (UniqueName: \"kubernetes.io/projected/2c07881f-4511-4cd1-9283-6891826b57a1-kube-api-access-zk8cn\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106286 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-run-k8s-cni-cncf-io\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106300 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-var-lib-cni-bin\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106329 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-etc-kubernetes\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.106384 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-cnibin\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.110031 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.120992 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.131554 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 
19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.142159 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.151891 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.166127 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.178281 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208072 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-os-release\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208147 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-var-lib-cni-multus\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208180 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-daemon-config\") pod \"multus-9hxf6\" (UID: 
\"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208224 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c07881f-4511-4cd1-9283-6891826b57a1-proxy-tls\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208277 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-socket-dir-parent\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208337 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-hostroot\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208328 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-var-lib-cni-multus\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208364 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-run-multus-certs\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc 
kubenswrapper[4788]: I0219 08:45:21.208466 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208496 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-var-lib-kubelet\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208500 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-hostroot\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208504 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-socket-dir-parent\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208519 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2c07881f-4511-4cd1-9283-6891826b57a1-mcd-auth-proxy-config\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208634 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-os-release\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208641 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk8cn\" (UniqueName: \"kubernetes.io/projected/2c07881f-4511-4cd1-9283-6891826b57a1-kube-api-access-zk8cn\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208748 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-run-k8s-cni-cncf-io\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208789 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-var-lib-cni-bin\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208831 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-etc-kubernetes\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208873 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-cnibin\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208910 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-system-cni-dir\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208943 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-cni-dir\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208962 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-var-lib-cni-bin\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208577 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-var-lib-kubelet\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.208979 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj598\" (UniqueName: \"kubernetes.io/projected/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-kube-api-access-jj598\") pod 
\"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209020 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209075 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2c07881f-4511-4cd1-9283-6891826b57a1-rootfs\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209075 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-daemon-config\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209098 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-os-release\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209119 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5c26787-29de-439a-86b8-920cac6c8ab8-cni-binary-copy\") pod 
\"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209141 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209158 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-os-release\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209074 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-run-k8s-cni-cncf-io\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209166 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-run-netns\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209189 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-run-netns\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " 
pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209197 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-system-cni-dir\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209213 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-cnibin\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209226 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-conf-dir\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209246 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lmj7\" (UniqueName: \"kubernetes.io/projected/a5c26787-29de-439a-86b8-920cac6c8ab8-kube-api-access-9lmj7\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209378 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2c07881f-4511-4cd1-9283-6891826b57a1-mcd-auth-proxy-config\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: 
I0219 08:45:21.209397 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-etc-kubernetes\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209121 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2c07881f-4511-4cd1-9283-6891826b57a1-rootfs\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209443 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-cnibin\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209441 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-system-cni-dir\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209480 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-conf-dir\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209479 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-system-cni-dir\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209512 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-cnibin\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209585 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-multus-cni-dir\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209663 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-cni-binary-copy\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209752 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a5c26787-29de-439a-86b8-920cac6c8ab8-cni-binary-copy\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209767 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a5c26787-29de-439a-86b8-920cac6c8ab8-host-run-multus-certs\") pod \"multus-9hxf6\" (UID: 
\"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209872 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.209897 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.212983 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2c07881f-4511-4cd1-9283-6891826b57a1-proxy-tls\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.219663 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.236770 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj598\" (UniqueName: \"kubernetes.io/projected/7a76d0d1-0c0d-47fa-952a-fe34687e34ca-kube-api-access-jj598\") pod \"multus-additional-cni-plugins-7s4rp\" (UID: \"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\") " pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.236779 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk8cn\" (UniqueName: \"kubernetes.io/projected/2c07881f-4511-4cd1-9283-6891826b57a1-kube-api-access-zk8cn\") pod \"machine-config-daemon-tftzx\" (UID: \"2c07881f-4511-4cd1-9283-6891826b57a1\") " pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.237102 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lmj7\" (UniqueName: \"kubernetes.io/projected/a5c26787-29de-439a-86b8-920cac6c8ab8-kube-api-access-9lmj7\") pod \"multus-9hxf6\" (UID: \"a5c26787-29de-439a-86b8-920cac6c8ab8\") " pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.245905 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.258624 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.269588 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.284944 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.296181 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.308833 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.310279 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.315393 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9hxf6" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.321476 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" Feb 19 08:45:21 crc kubenswrapper[4788]: W0219 08:45:21.324364 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c07881f_4511_4cd1_9283_6891826b57a1.slice/crio-f6b26c0382165787b5e45b5dc79ff3e2e3e59df885cd1b9bd9d10989b1f0c544 WatchSource:0}: Error finding container f6b26c0382165787b5e45b5dc79ff3e2e3e59df885cd1b9bd9d10989b1f0c544: Status 404 returned error can't find the container with id f6b26c0382165787b5e45b5dc79ff3e2e3e59df885cd1b9bd9d10989b1f0c544 Feb 19 08:45:21 crc kubenswrapper[4788]: W0219 08:45:21.325684 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5c26787_29de_439a_86b8_920cac6c8ab8.slice/crio-a327fa693c9c927cc4eea31f6e7b505646140b7f11d098f4cb0b73f99c9ad5e1 WatchSource:0}: Error finding container a327fa693c9c927cc4eea31f6e7b505646140b7f11d098f4cb0b73f99c9ad5e1: Status 404 returned error can't find the container with id a327fa693c9c927cc4eea31f6e7b505646140b7f11d098f4cb0b73f99c9ad5e1 Feb 19 08:45:21 crc kubenswrapper[4788]: W0219 08:45:21.341225 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a76d0d1_0c0d_47fa_952a_fe34687e34ca.slice/crio-f0cb7c9759d4937c2b3bd173695ac9a49f5ddc268dfae26c9823d8dc5cc6229f WatchSource:0}: Error finding container f0cb7c9759d4937c2b3bd173695ac9a49f5ddc268dfae26c9823d8dc5cc6229f: Status 404 returned error can't find the container with id f0cb7c9759d4937c2b3bd173695ac9a49f5ddc268dfae26c9823d8dc5cc6229f Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.381047 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xmshh"] Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.382751 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.388220 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.388365 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.388877 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.389073 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.389776 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.390126 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.393220 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.411142 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.428399 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.444725 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.458458 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.473483 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.486707 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.499119 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.510780 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-bin\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.510826 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-slash\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.510847 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-kubelet\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.510866 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-var-lib-openvswitch\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.510884 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-env-overrides\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.510898 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f6gjm\" (UniqueName: \"kubernetes.io/projected/fd5c1c46-74a4-41f4-ad05-af438781bd6a-kube-api-access-f6gjm\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.510926 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-systemd\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.510942 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-config\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.510965 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-openvswitch\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.510980 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-log-socket\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.510994 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovn-node-metrics-cert\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.511010 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-netns\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.511025 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-netd\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.511041 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.511065 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-node-log\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.511083 4788 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-script-lib\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.511106 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-systemd-units\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.511122 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-etc-openvswitch\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.511137 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-ovn\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.511179 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: 
I0219 08:45:21.519426 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.535931 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 
08:45:21.551763 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.563962 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.575211 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.592682 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612316 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-config\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612366 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-openvswitch\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612397 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-log-socket\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612433 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovn-node-metrics-cert\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612455 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-openvswitch\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612468 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-netns\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612496 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-netd\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612528 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612576 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-node-log\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612622 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-script-lib\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612652 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-systemd-units\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612680 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-etc-openvswitch\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612708 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-ovn\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612736 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612769 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-slash\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612796 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-bin\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612829 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-kubelet\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612863 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-var-lib-openvswitch\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612892 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-env-overrides\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612922 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6gjm\" (UniqueName: \"kubernetes.io/projected/fd5c1c46-74a4-41f4-ad05-af438781bd6a-kube-api-access-f6gjm\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612978 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-systemd\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612997 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-etc-openvswitch\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.612497 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-log-socket\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613085 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613110 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-slash\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613124 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-kubelet\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613131 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-systemd-units\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613143 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613167 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-bin\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613162 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-config\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613203 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-systemd\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613175 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-netd\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613281 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-var-lib-openvswitch\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613295 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-netns\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613277 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-ovn\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613214 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-node-log\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.613948 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-script-lib\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.614006 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-env-overrides\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.616476 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovn-node-metrics-cert\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.633136 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6gjm\" (UniqueName: 
\"kubernetes.io/projected/fd5c1c46-74a4-41f4-ad05-af438781bd6a-kube-api-access-f6gjm\") pod \"ovnkube-node-xmshh\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.669632 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 23:55:56.827269843 +0000 UTC Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.699188 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:21 crc kubenswrapper[4788]: W0219 08:45:21.719548 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd5c1c46_74a4_41f4_ad05_af438781bd6a.slice/crio-58f1537e6e76b51681c65216d8d7d4364812d5e6e56fb239d3392181a50f22c0 WatchSource:0}: Error finding container 58f1537e6e76b51681c65216d8d7d4364812d5e6e56fb239d3392181a50f22c0: Status 404 returned error can't find the container with id 58f1537e6e76b51681c65216d8d7d4364812d5e6e56fb239d3392181a50f22c0 Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.855059 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9"} Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.855131 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96"} Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.855150 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"f6b26c0382165787b5e45b5dc79ff3e2e3e59df885cd1b9bd9d10989b1f0c544"} Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.856794 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e"} Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.858746 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6" exitCode=0 Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.858833 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"} Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.858868 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"58f1537e6e76b51681c65216d8d7d4364812d5e6e56fb239d3392181a50f22c0"} Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.860411 4788 generic.go:334] "Generic (PLEG): container finished" podID="7a76d0d1-0c0d-47fa-952a-fe34687e34ca" containerID="e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97" exitCode=0 Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.860479 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" 
event={"ID":"7a76d0d1-0c0d-47fa-952a-fe34687e34ca","Type":"ContainerDied","Data":"e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97"} Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.860550 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" event={"ID":"7a76d0d1-0c0d-47fa-952a-fe34687e34ca","Type":"ContainerStarted","Data":"f0cb7c9759d4937c2b3bd173695ac9a49f5ddc268dfae26c9823d8dc5cc6229f"} Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.863621 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hxf6" event={"ID":"a5c26787-29de-439a-86b8-920cac6c8ab8","Type":"ContainerStarted","Data":"217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0"} Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.863676 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hxf6" event={"ID":"a5c26787-29de-439a-86b8-920cac6c8ab8","Type":"ContainerStarted","Data":"a327fa693c9c927cc4eea31f6e7b505646140b7f11d098f4cb0b73f99c9ad5e1"} Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.875910 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, 
/tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.892040 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.908772 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.929349 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.947219 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.967466 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:21 crc kubenswrapper[4788]: I0219 08:45:21.990846 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:21Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.014410 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.037015 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.061484 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35f
ef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.077618 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.089720 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.107346 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.132944 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.165621 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.182671 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.193663 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.206714 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.224081 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.242275 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.257193 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.273905 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.290987 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.317723 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.358758 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.405585 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.423073 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.423196 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.423226 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423327 4788 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423354 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:45:26.423313248 +0000 UTC m=+28.411324720 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.423400 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423403 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:26.4233944 +0000 UTC m=+28.411405872 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.423432 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423428 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423483 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423498 4788 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423505 4788 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423540 4788 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:26.423522383 +0000 UTC m=+28.411534075 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423576 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:26.423554944 +0000 UTC m=+28.411566416 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423667 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423687 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423701 4788 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.423753 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:26.423746098 +0000 UTC m=+28.411757570 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.513672 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-6lplm"] Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.514662 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6lplm" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.517505 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.518168 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.518178 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.519366 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.539135 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.560784 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.598615 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.625364 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93e79622-b196-49af-8474-1d25444de3ca-host\") pod \"node-ca-6lplm\" (UID: \"93e79622-b196-49af-8474-1d25444de3ca\") " pod="openshift-image-registry/node-ca-6lplm" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.625468 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/93e79622-b196-49af-8474-1d25444de3ca-serviceca\") pod \"node-ca-6lplm\" (UID: \"93e79622-b196-49af-8474-1d25444de3ca\") " pod="openshift-image-registry/node-ca-6lplm" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.625550 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z85l\" (UniqueName: \"kubernetes.io/projected/93e79622-b196-49af-8474-1d25444de3ca-kube-api-access-6z85l\") pod \"node-ca-6lplm\" (UID: \"93e79622-b196-49af-8474-1d25444de3ca\") " pod="openshift-image-registry/node-ca-6lplm" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.638495 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f3
6f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.669849 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 22:02:12.732792942 +0000 UTC Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.679433 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.717803 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.717954 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.718008 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.718083 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.718116 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:22 crc kubenswrapper[4788]: E0219 08:45:22.718180 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.718507 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.727132 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93e79622-b196-49af-8474-1d25444de3ca-host\") pod \"node-ca-6lplm\" (UID: \"93e79622-b196-49af-8474-1d25444de3ca\") " pod="openshift-image-registry/node-ca-6lplm" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.727215 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/93e79622-b196-49af-8474-1d25444de3ca-serviceca\") pod \"node-ca-6lplm\" (UID: \"93e79622-b196-49af-8474-1d25444de3ca\") " pod="openshift-image-registry/node-ca-6lplm" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.727239 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z85l\" (UniqueName: \"kubernetes.io/projected/93e79622-b196-49af-8474-1d25444de3ca-kube-api-access-6z85l\") pod \"node-ca-6lplm\" (UID: \"93e79622-b196-49af-8474-1d25444de3ca\") " pod="openshift-image-registry/node-ca-6lplm" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.727332 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93e79622-b196-49af-8474-1d25444de3ca-host\") pod \"node-ca-6lplm\" (UID: \"93e79622-b196-49af-8474-1d25444de3ca\") " pod="openshift-image-registry/node-ca-6lplm" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.729429 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/93e79622-b196-49af-8474-1d25444de3ca-serviceca\") pod \"node-ca-6lplm\" (UID: \"93e79622-b196-49af-8474-1d25444de3ca\") " pod="openshift-image-registry/node-ca-6lplm" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.769048 4788 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6z85l\" (UniqueName: \"kubernetes.io/projected/93e79622-b196-49af-8474-1d25444de3ca-kube-api-access-6z85l\") pod \"node-ca-6lplm\" (UID: \"93e79622-b196-49af-8474-1d25444de3ca\") " pod="openshift-image-registry/node-ca-6lplm" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.779356 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.821333 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.854091 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.871160 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"} Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.871233 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" 
event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"} Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.871274 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"} Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.871295 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"} Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.873456 4788 generic.go:334] "Generic (PLEG): container finished" podID="7a76d0d1-0c0d-47fa-952a-fe34687e34ca" containerID="58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250" exitCode=0 Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.873527 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" event={"ID":"7a76d0d1-0c0d-47fa-952a-fe34687e34ca","Type":"ContainerDied","Data":"58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250"} Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.897049 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.931280 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6lplm" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.938268 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:22 crc kubenswrapper[4788]: I0219 08:45:22.976814 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:22Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.018547 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.061141 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.102928 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.138021 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.178761 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.214527 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.256077 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.294692 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.334890 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.375015 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.414960 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.457712 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.496805 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.545823 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.577050 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.622804 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.670880 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 16:06:53.681598918 +0000 UTC Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.881407 4788 generic.go:334] "Generic (PLEG): container finished" podID="7a76d0d1-0c0d-47fa-952a-fe34687e34ca" containerID="858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868" exitCode=0 Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.881507 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" event={"ID":"7a76d0d1-0c0d-47fa-952a-fe34687e34ca","Type":"ContainerDied","Data":"858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868"} Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.886988 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"} Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.887034 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"} Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.889052 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6lplm" event={"ID":"93e79622-b196-49af-8474-1d25444de3ca","Type":"ContainerStarted","Data":"b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917"} Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.889128 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6lplm" event={"ID":"93e79622-b196-49af-8474-1d25444de3ca","Type":"ContainerStarted","Data":"6e27d5eca1e06f73a0a0ac76fe2084f751b3270ac1549033424ff3e6cc60209d"} Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.901042 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.926928 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.944707 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 
08:45:23.958199 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.972736 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:23 crc kubenswrapper[4788]: I0219 08:45:23.988949 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:23Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.009509 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.023915 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.041422 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.055018 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.065525 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.096391 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b
7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.139114 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.178130 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.216875 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 
19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.256527 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.298831 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.302592 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.306889 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.337486 4788 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.372619 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountP
ath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.405829 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 
08:45:24.434261 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.472942 4788 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.474650 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.474691 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.474710 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.474825 4788 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.482331 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.529940 4788 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.530313 4788 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.531624 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.531675 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.531688 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.531705 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.531717 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:24Z","lastTransitionTime":"2026-02-19T08:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:24 crc kubenswrapper[4788]: E0219 08:45:24.557896 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.561567 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.561596 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.561605 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.561618 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.561628 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:24Z","lastTransitionTime":"2026-02-19T08:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.569336 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: E0219 08:45:24.575543 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.580199 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.580282 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.580297 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.580317 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.580329 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:24Z","lastTransitionTime":"2026-02-19T08:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:24 crc kubenswrapper[4788]: E0219 08:45:24.596921 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.598394 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909
e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.601807 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.601848 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.601859 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.601875 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.601888 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:24Z","lastTransitionTime":"2026-02-19T08:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:24 crc kubenswrapper[4788]: E0219 08:45:24.613398 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.619893 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.619968 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.619986 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.620011 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.620028 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:24Z","lastTransitionTime":"2026-02-19T08:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:24 crc kubenswrapper[4788]: E0219 08:45:24.633696 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: E0219 08:45:24.633925 4788 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.636445 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.636488 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.636498 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.636515 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.636527 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:24Z","lastTransitionTime":"2026-02-19T08:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.639096 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.671415 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 19:05:40.152449769 +0000 UTC Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.676026 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f3
6f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.713710 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.713786 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.713719 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:24 crc kubenswrapper[4788]: E0219 08:45:24.713873 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:24 crc kubenswrapper[4788]: E0219 08:45:24.714004 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:24 crc kubenswrapper[4788]: E0219 08:45:24.714084 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.719774 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.739538 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.739598 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.739615 4788 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.739641 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.739659 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:24Z","lastTransitionTime":"2026-02-19T08:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.756088 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.802093 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.841213 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.842860 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.842904 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.842921 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.842944 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.842963 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:24Z","lastTransitionTime":"2026-02-19T08:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.882357 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.895280 4788 generic.go:334] "Generic (PLEG): container finished" podID="7a76d0d1-0c0d-47fa-952a-fe34687e34ca" containerID="d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc" exitCode=0 Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.895308 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" event={"ID":"7a76d0d1-0c0d-47fa-952a-fe34687e34ca","Type":"ContainerDied","Data":"d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc"} Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.921689 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4b
df56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.944889 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.944921 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.944929 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.944942 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.944951 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:24Z","lastTransitionTime":"2026-02-19T08:45:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:24 crc kubenswrapper[4788]: I0219 08:45:24.958693 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.000833 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-19T08:45:24Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.043590 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.047072 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.047110 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.047119 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.047133 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.047142 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:25Z","lastTransitionTime":"2026-02-19T08:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.083592 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.119481 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.150522 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.150579 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.150592 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.150610 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.150621 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:25Z","lastTransitionTime":"2026-02-19T08:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.172922 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.197787 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.239673 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.252551 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.252594 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.252607 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.252628 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.252645 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:25Z","lastTransitionTime":"2026-02-19T08:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.278211 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.316557 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.359445 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.359490 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.359502 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.359518 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.359531 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:25Z","lastTransitionTime":"2026-02-19T08:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.366545 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.401374 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.438498 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.462707 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.463002 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.463171 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.463532 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.463762 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:25Z","lastTransitionTime":"2026-02-19T08:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.483364 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.517803 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.558150 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.567052 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.567092 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.567101 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 
08:45:25.567116 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.567126 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:25Z","lastTransitionTime":"2026-02-19T08:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.603410 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.647954 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4b
df56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.669424 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.669466 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.669478 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.669495 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.669508 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:25Z","lastTransitionTime":"2026-02-19T08:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.672305 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 08:06:20.361275296 +0000 UTC Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.693136 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.738176 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.764190 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.772077 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.772136 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.772156 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.772180 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.772198 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:25Z","lastTransitionTime":"2026-02-19T08:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.800157 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.841695 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.875494 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.875577 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.875606 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.875637 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.875660 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:25Z","lastTransitionTime":"2026-02-19T08:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.887914 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.902304 4788 generic.go:334] "Generic (PLEG): container finished" podID="7a76d0d1-0c0d-47fa-952a-fe34687e34ca" containerID="b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6" exitCode=0 Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.902418 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" 
event={"ID":"7a76d0d1-0c0d-47fa-952a-fe34687e34ca","Type":"ContainerDied","Data":"b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6"} Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.909790 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"} Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.927197 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.964899 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:25Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.979603 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.979650 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.979661 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.979681 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:25 crc kubenswrapper[4788]: I0219 08:45:25.979692 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:25Z","lastTransitionTime":"2026-02-19T08:45:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.004282 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.048893 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.081521 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.081566 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:26 crc 
kubenswrapper[4788]: I0219 08:45:26.081577 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.081597 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.081609 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:26Z","lastTransitionTime":"2026-02-19T08:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.087658 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.127361 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.160085 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.184104 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.184158 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.184173 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.184195 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.184212 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:26Z","lastTransitionTime":"2026-02-19T08:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.203166 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.239436 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.247786 4788 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.286388 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.286450 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.286468 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.286492 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.286512 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:26Z","lastTransitionTime":"2026-02-19T08:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.306147 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.341284 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.390174 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.390538 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.390707 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.390846 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.390974 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:26Z","lastTransitionTime":"2026-02-19T08:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.396506 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.426229 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.461974 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.467282 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.467767 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:45:34.467718226 +0000 UTC m=+36.455729738 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.467864 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.467944 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.468036 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.468099 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468187 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468271 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468295 4788 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468326 4788 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468377 4788 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468403 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468438 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468384 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:34.468358991 +0000 UTC m=+36.456370493 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468462 4788 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468494 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:34.468465473 +0000 UTC m=+36.456476975 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468531 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:34.468508134 +0000 UTC m=+36.456519716 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.468563 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:34.468547585 +0000 UTC m=+36.456559217 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.493220 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.493300 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.493318 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.493340 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.493358 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:26Z","lastTransitionTime":"2026-02-19T08:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.502835 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.543060 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.589877 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, 
/tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.595765 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.595821 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.595837 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.595863 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.595880 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:26Z","lastTransitionTime":"2026-02-19T08:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.621883 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc54
6474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.672726 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 19:24:52.467209154 +0000 UTC Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.699089 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.699145 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.699159 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.699177 4788 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.699190 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:26Z","lastTransitionTime":"2026-02-19T08:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.713446 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.713537 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.713578 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.713625 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.713704 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:26 crc kubenswrapper[4788]: E0219 08:45:26.713849 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.802334 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.802402 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.802419 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.802444 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.802461 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:26Z","lastTransitionTime":"2026-02-19T08:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.906180 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.906297 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.906324 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.906354 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.906373 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:26Z","lastTransitionTime":"2026-02-19T08:45:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.919653 4788 generic.go:334] "Generic (PLEG): container finished" podID="7a76d0d1-0c0d-47fa-952a-fe34687e34ca" containerID="914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85" exitCode=0 Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.919693 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" event={"ID":"7a76d0d1-0c0d-47fa-952a-fe34687e34ca","Type":"ContainerDied","Data":"914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85"} Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.944644 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.972754 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:26 crc kubenswrapper[4788]: I0219 08:45:26.989906 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:26Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.009136 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19
T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.011539 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.011601 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.011620 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.011648 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.011665 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:27Z","lastTransitionTime":"2026-02-19T08:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.023043 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.040224 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.061171 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.083984 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.103406 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.115460 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.115510 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.115704 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.115722 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.115746 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.115764 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:27Z","lastTransitionTime":"2026-02-19T08:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.131275 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.151380 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.171221 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, 
/tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.193015 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c
9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.217711 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.219355 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.219397 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.219408 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.219422 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.219432 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:27Z","lastTransitionTime":"2026-02-19T08:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.321944 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.322020 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.322041 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.322071 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.322093 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:27Z","lastTransitionTime":"2026-02-19T08:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.425338 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.425384 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.425401 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.425421 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.425436 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:27Z","lastTransitionTime":"2026-02-19T08:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.528627 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.528666 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.528679 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.528700 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.528715 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:27Z","lastTransitionTime":"2026-02-19T08:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.630789 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.630822 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.630832 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.630848 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.630860 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:27Z","lastTransitionTime":"2026-02-19T08:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.673725 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 14:10:02.415010484 +0000 UTC Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.733469 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.733506 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.733514 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.733529 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.733541 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:27Z","lastTransitionTime":"2026-02-19T08:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.837009 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.837053 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.837065 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.837083 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.837096 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:27Z","lastTransitionTime":"2026-02-19T08:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.928209 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" event={"ID":"7a76d0d1-0c0d-47fa-952a-fe34687e34ca","Type":"ContainerStarted","Data":"4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.932944 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.933211 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.939699 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.939731 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.939742 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.939757 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.939769 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:27Z","lastTransitionTime":"2026-02-19T08:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.950697 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83b
d4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.963768 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.970270 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.978151 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:27 crc kubenswrapper[4788]: I0219 08:45:27.997354 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:27Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.019508 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce
1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:
45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.033473 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.044830 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.044892 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.044913 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.044939 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.044973 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:28Z","lastTransitionTime":"2026-02-19T08:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.065651 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.095369 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.108708 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.119001 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.131223 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.146771 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.147572 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.147629 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.147648 4788 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.147672 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.147689 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:28Z","lastTransitionTime":"2026-02-19T08:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.163743 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0
f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.181342 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.199994 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.217525 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.235510 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.249638 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.249676 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.249684 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.249699 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.249708 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:28Z","lastTransitionTime":"2026-02-19T08:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.256831 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.274572 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.290065 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.305359 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.321656 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce
1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:
45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.336751 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.350002 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.351888 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.351922 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.351931 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.351956 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.351968 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:28Z","lastTransitionTime":"2026-02-19T08:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.365390 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.395286 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.420010 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f3
6f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.451390 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.453921 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.453954 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.453966 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.453983 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.453994 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:28Z","lastTransitionTime":"2026-02-19T08:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.473492 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.484765 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.557414 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.557487 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.557506 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.557533 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.557553 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:28Z","lastTransitionTime":"2026-02-19T08:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.659948 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.660009 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.660027 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.660052 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.660070 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:28Z","lastTransitionTime":"2026-02-19T08:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.674499 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 09:19:46.073074759 +0000 UTC Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.713667 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.713771 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:28 crc kubenswrapper[4788]: E0219 08:45:28.713835 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.713855 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:28 crc kubenswrapper[4788]: E0219 08:45:28.713972 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:28 crc kubenswrapper[4788]: E0219 08:45:28.714091 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.731134 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.762546 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.763218 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.763293 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.763304 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.763325 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.763336 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:28Z","lastTransitionTime":"2026-02-19T08:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.784432 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.799672 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.819604 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.844795 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.865914 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.865960 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.865979 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.866003 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.866020 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:28Z","lastTransitionTime":"2026-02-19T08:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.868771 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.882934 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.891091 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.910739 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.930574 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.935762 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 
08:45:28.936262 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.947036 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\
\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.960494 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235d
a9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.965171 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.967841 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.967890 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.967900 4788 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.967911 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.967920 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:28Z","lastTransitionTime":"2026-02-19T08:45:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.977498 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b15
4edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is 
after 2025-08-24T17:21:41Z" Feb 19 08:45:28 crc kubenswrapper[4788]: I0219 08:45:28.988636 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:28Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.023529 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.070311 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.070364 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.070384 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.070412 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.070432 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:29Z","lastTransitionTime":"2026-02-19T08:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.070466 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.112209 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.142288 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.173215 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.173291 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.173303 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.173323 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.173334 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:29Z","lastTransitionTime":"2026-02-19T08:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.174644 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.216642 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.260833 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.275026 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.275066 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.275074 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.275105 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.275116 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:29Z","lastTransitionTime":"2026-02-19T08:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.302076 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc54
6474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.341016 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.377775 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.377822 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.377836 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.377854 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.377868 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:29Z","lastTransitionTime":"2026-02-19T08:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.384030 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.422236 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.463682 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.470389 4788 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.480771 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.480832 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.480850 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.480875 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.480894 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:29Z","lastTransitionTime":"2026-02-19T08:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.522570 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.564065 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.586006 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:29 crc 
kubenswrapper[4788]: I0219 08:45:29.586049 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.586059 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.586096 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.586107 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:29Z","lastTransitionTime":"2026-02-19T08:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.601500 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911
452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.638444 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:29Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.675100 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 12:30:34.6830613 +0000 UTC Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.689131 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.689173 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.689182 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.689197 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 
08:45:29.689208 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:29Z","lastTransitionTime":"2026-02-19T08:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.791676 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.791896 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.791923 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.791955 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.791979 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:29Z","lastTransitionTime":"2026-02-19T08:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.895668 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.895735 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.895761 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.895794 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.895818 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:29Z","lastTransitionTime":"2026-02-19T08:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.922101 4788 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.939367 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.998523 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.998561 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.998573 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.998589 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:29 crc kubenswrapper[4788]: I0219 08:45:29.998603 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:29Z","lastTransitionTime":"2026-02-19T08:45:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.100782 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.100838 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.100854 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.100875 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.100890 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:30Z","lastTransitionTime":"2026-02-19T08:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.154597 4788 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.203441 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.203491 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.203503 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.203522 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.203535 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:30Z","lastTransitionTime":"2026-02-19T08:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.306239 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.306328 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.306345 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.306369 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.306394 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:30Z","lastTransitionTime":"2026-02-19T08:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.408938 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.408981 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.408991 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.409010 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.409022 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:30Z","lastTransitionTime":"2026-02-19T08:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.511502 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.511548 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.511564 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.511585 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.511601 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:30Z","lastTransitionTime":"2026-02-19T08:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.614384 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.614449 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.614472 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.614504 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.614526 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:30Z","lastTransitionTime":"2026-02-19T08:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.676215 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 14:48:26.74406953 +0000 UTC Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.713710 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.713723 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:30 crc kubenswrapper[4788]: E0219 08:45:30.714034 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.713787 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:30 crc kubenswrapper[4788]: E0219 08:45:30.713952 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:30 crc kubenswrapper[4788]: E0219 08:45:30.714177 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.717141 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.717176 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.717200 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.717221 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.717236 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:30Z","lastTransitionTime":"2026-02-19T08:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.833095 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.833133 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.833142 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.833155 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.833164 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:30Z","lastTransitionTime":"2026-02-19T08:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.935570 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.935607 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.935616 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.935630 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.935638 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:30Z","lastTransitionTime":"2026-02-19T08:45:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.945112 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/0.log" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.948749 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1" exitCode=1 Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.948817 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1"} Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.949851 4788 scope.go:117] "RemoveContainer" containerID="748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.964813 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:30 crc kubenswrapper[4788]: I0219 08:45:30.984696 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.001529 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:30Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.023762 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce
1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:
45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.039041 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.039073 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.039086 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.039102 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.039112 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:31Z","lastTransitionTime":"2026-02-19T08:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.042189 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.059619 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.084326 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:30Z\\\",\\\"message\\\":\\\"go:160\\\\nI0219 08:45:30.554031 6086 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 08:45:30.554440 6086 handler.go:190] Sending *v1.NetworkPolicy event handler 4 
for removal\\\\nI0219 08:45:30.554470 6086 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:30.554477 6086 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:30.554501 6086 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 08:45:30.554526 6086 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:30.554536 6086 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:30.554578 6086 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 08:45:30.554601 6086 factory.go:656] Stopping watch factory\\\\nI0219 08:45:30.554622 6086 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 08:45:30.554626 6086 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:30.554668 6086 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 08:45:30.554632 6086 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 08:45:30.554648 6086 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:30.554777 6086 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.097893 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.110942 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.135550 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.141320 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.141383 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.141403 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.141426 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.141442 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:31Z","lastTransitionTime":"2026-02-19T08:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.153276 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.169733 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.189418 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.207822 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.221184 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.244587 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.244626 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.244672 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.244694 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.244710 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:31Z","lastTransitionTime":"2026-02-19T08:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.347510 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.347564 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.347581 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.347606 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.347624 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:31Z","lastTransitionTime":"2026-02-19T08:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.450809 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.450881 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.450898 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.450924 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.450942 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:31Z","lastTransitionTime":"2026-02-19T08:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.553343 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.553395 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.553439 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.553462 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.553476 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:31Z","lastTransitionTime":"2026-02-19T08:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.656433 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.656474 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.656485 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.656502 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.656515 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:31Z","lastTransitionTime":"2026-02-19T08:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.677063 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 14:22:24.204951561 +0000 UTC Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.758991 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.759035 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.759046 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.759064 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.759078 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:31Z","lastTransitionTime":"2026-02-19T08:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.861598 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.861646 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.861659 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.861677 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.861691 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:31Z","lastTransitionTime":"2026-02-19T08:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.955040 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/0.log" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.959404 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f"} Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.959561 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.964476 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.964520 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.964536 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.964558 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.964579 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:31Z","lastTransitionTime":"2026-02-19T08:45:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:31 crc kubenswrapper[4788]: I0219 08:45:31.995827 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:30Z\\\",\\\"message\\\":\\\"go:160\\\\nI0219 08:45:30.554031 6086 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 08:45:30.554440 6086 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 08:45:30.554470 6086 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 
08:45:30.554477 6086 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:30.554501 6086 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 08:45:30.554526 6086 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:30.554536 6086 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:30.554578 6086 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 08:45:30.554601 6086 factory.go:656] Stopping watch factory\\\\nI0219 08:45:30.554622 6086 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 08:45:30.554626 6086 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:30.554668 6086 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 08:45:30.554632 6086 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 08:45:30.554648 6086 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:30.554777 6086 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:31Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.028700 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.041478 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.052618 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.066222 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.067164 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.067204 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.067217 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.067237 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.067271 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:32Z","lastTransitionTime":"2026-02-19T08:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.084348 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.094628 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.110161 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.124569 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.137660 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.152276 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19
T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.168795 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.171314 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.171366 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.171379 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.171403 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.171418 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:32Z","lastTransitionTime":"2026-02-19T08:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.183962 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.201356 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.211392 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2"] Feb 19 
08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.212037 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.214542 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.215046 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.226420 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\
"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc0
03f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.243821 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.259269 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.273367 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.273413 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.273422 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.273435 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.273443 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:32Z","lastTransitionTime":"2026-02-19T08:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.274462 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.287330 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.299439 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.312828 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19
T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.329903 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.342968 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.347799 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/798f2a0d-d6be-46a9-83e5-67a10abcce47-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.347834 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/798f2a0d-d6be-46a9-83e5-67a10abcce47-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.347856 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/798f2a0d-d6be-46a9-83e5-67a10abcce47-env-overrides\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.347875 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfj55\" (UniqueName: \"kubernetes.io/projected/798f2a0d-d6be-46a9-83e5-67a10abcce47-kube-api-access-wfj55\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.354862 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.368011 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce
1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:
45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.375744 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.375980 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.376196 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.376428 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.376635 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:32Z","lastTransitionTime":"2026-02-19T08:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.386158 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:30Z\\\",\\\"message\\\":\\\"go:160\\\\nI0219 08:45:30.554031 6086 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 08:45:30.554440 6086 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 08:45:30.554470 6086 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 
08:45:30.554477 6086 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:30.554501 6086 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 08:45:30.554526 6086 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:30.554536 6086 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:30.554578 6086 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 08:45:30.554601 6086 factory.go:656] Stopping watch factory\\\\nI0219 08:45:30.554622 6086 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 08:45:30.554626 6086 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:30.554668 6086 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 08:45:30.554632 6086 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 08:45:30.554648 6086 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:30.554777 6086 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 
08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.399286 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.420989 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee
2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.439403 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.449258 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/798f2a0d-d6be-46a9-83e5-67a10abcce47-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.449330 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/798f2a0d-d6be-46a9-83e5-67a10abcce47-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.449364 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/798f2a0d-d6be-46a9-83e5-67a10abcce47-env-overrides\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.449385 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfj55\" (UniqueName: \"kubernetes.io/projected/798f2a0d-d6be-46a9-83e5-67a10abcce47-kube-api-access-wfj55\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.449924 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.450825 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/798f2a0d-d6be-46a9-83e5-67a10abcce47-env-overrides\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.451311 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/798f2a0d-d6be-46a9-83e5-67a10abcce47-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.455552 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/798f2a0d-d6be-46a9-83e5-67a10abcce47-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.470041 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858
651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.473648 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfj55\" (UniqueName: \"kubernetes.io/projected/798f2a0d-d6be-46a9-83e5-67a10abcce47-kube-api-access-wfj55\") pod \"ovnkube-control-plane-749d76644c-88dw2\" (UID: \"798f2a0d-d6be-46a9-83e5-67a10abcce47\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.478840 4788 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.478877 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.478892 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.478910 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.478922 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:32Z","lastTransitionTime":"2026-02-19T08:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.527057 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" Feb 19 08:45:32 crc kubenswrapper[4788]: W0219 08:45:32.550921 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod798f2a0d_d6be_46a9_83e5_67a10abcce47.slice/crio-9606ba475dfe0f5d1e212d85dcde3712bb331be6d80e098bf3ea7ac622ce4957 WatchSource:0}: Error finding container 9606ba475dfe0f5d1e212d85dcde3712bb331be6d80e098bf3ea7ac622ce4957: Status 404 returned error can't find the container with id 9606ba475dfe0f5d1e212d85dcde3712bb331be6d80e098bf3ea7ac622ce4957 Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.582877 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.582939 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.582956 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.582980 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.583000 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:32Z","lastTransitionTime":"2026-02-19T08:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.677833 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 14:27:44.178511816 +0000 UTC Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.687766 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.687849 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.687873 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.687906 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.687933 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:32Z","lastTransitionTime":"2026-02-19T08:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.714625 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.714679 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.714824 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:32 crc kubenswrapper[4788]: E0219 08:45:32.714835 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:32 crc kubenswrapper[4788]: E0219 08:45:32.714957 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:32 crc kubenswrapper[4788]: E0219 08:45:32.715066 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.790297 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.790349 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.790363 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.790383 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.790396 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:32Z","lastTransitionTime":"2026-02-19T08:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.894001 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.894087 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.894113 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.894148 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.894170 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:32Z","lastTransitionTime":"2026-02-19T08:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.964988 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" event={"ID":"798f2a0d-d6be-46a9-83e5-67a10abcce47","Type":"ContainerStarted","Data":"1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.965049 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" event={"ID":"798f2a0d-d6be-46a9-83e5-67a10abcce47","Type":"ContainerStarted","Data":"9606ba475dfe0f5d1e212d85dcde3712bb331be6d80e098bf3ea7ac622ce4957"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.969800 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/1.log" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.970827 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/0.log" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.975200 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f" exitCode=1 Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.975319 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f"} Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.975445 4788 scope.go:117] "RemoveContainer" containerID="748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.976071 
4788 scope.go:117] "RemoveContainer" containerID="20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f" Feb 19 08:45:32 crc kubenswrapper[4788]: E0219 08:45:32.976389 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.997098 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.997138 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.997149 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.997168 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:32 crc kubenswrapper[4788]: I0219 08:45:32.997182 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:32Z","lastTransitionTime":"2026-02-19T08:45:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.000300 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:32Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.057621 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.077919 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.095190 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.100682 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:33 crc 
kubenswrapper[4788]: I0219 08:45:33.100740 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.100758 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.100782 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.100801 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:33Z","lastTransitionTime":"2026-02-19T08:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.112675 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911
452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.122728 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.132674 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.153904 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://748a03c199c7cf637b7a381d52fd4d56091d061b5b3bef0801e9eede2a07b0f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:30Z\\\",\\\"message\\\":\\\"go:160\\\\nI0219 08:45:30.554031 6086 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0219 08:45:30.554440 6086 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 08:45:30.554470 6086 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 
08:45:30.554477 6086 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:30.554501 6086 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 08:45:30.554526 6086 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:30.554536 6086 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:30.554578 6086 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 08:45:30.554601 6086 factory.go:656] Stopping watch factory\\\\nI0219 08:45:30.554622 6086 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0219 08:45:30.554626 6086 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:30.554668 6086 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 08:45:30.554632 6086 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 08:45:30.554648 6086 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:30.554777 6086 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"rding success event on pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0219 08:45:32.230354 6203 factory.go:1336] Added *v1.Pod event handler 3\\\\nI0219 08:45:32.230371 6203 admin_network_policy_controller.go:133] Setting up event handlers for Admin Network Policy\\\\nI0219 08:45:32.230414 6203 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:32.230445 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 08:45:32.230454 6203 obj_retry.go:434] periodicallyRetryResources: 
Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0219 08:45:32.230529 6203 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-net
d\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.176990 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.192295 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.203261 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.203305 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.203319 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.203337 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.203353 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:33Z","lastTransitionTime":"2026-02-19T08:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.212607 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.246987 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.265556 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.277699 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.294074 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.305872 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.307544 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.307566 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.307591 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.307610 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:33Z","lastTransitionTime":"2026-02-19T08:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.314284 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8
0bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:33Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.410355 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.410404 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.410415 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.410433 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.410444 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:33Z","lastTransitionTime":"2026-02-19T08:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.513707 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.513749 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.513761 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.513777 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.513788 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:33Z","lastTransitionTime":"2026-02-19T08:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.616437 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.616500 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.616519 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.616544 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.616561 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:33Z","lastTransitionTime":"2026-02-19T08:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.678089 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 10:42:35.041302244 +0000 UTC Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.720016 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.720057 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.720068 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.720084 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.720097 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:33Z","lastTransitionTime":"2026-02-19T08:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.822010 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.822055 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.822068 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.822087 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.822104 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:33Z","lastTransitionTime":"2026-02-19T08:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.925641 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.925716 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.925743 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.925780 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.925804 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:33Z","lastTransitionTime":"2026-02-19T08:45:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.980737 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/1.log" Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.988440 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" event={"ID":"798f2a0d-d6be-46a9-83e5-67a10abcce47","Type":"ContainerStarted","Data":"8d8f2c4b72a085132face0662767c9e1a019820e5eee1be0024537ac8b673203"} Feb 19 08:45:33 crc kubenswrapper[4788]: I0219 08:45:33.990404 4788 scope.go:117] "RemoveContainer" containerID="20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f" Feb 19 08:45:33 crc kubenswrapper[4788]: E0219 08:45:33.990856 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.008629 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.024014 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.028474 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.028539 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.028552 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.028571 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.028582 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.041488 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.079132 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.080046 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-qbwlq"] Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.080908 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.081055 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.101561 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.119837 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.131858 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.131995 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.132016 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.132038 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.132055 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.140087 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.161816 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.184035 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 
08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.204073 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.234954 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.235022 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.235040 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.235068 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.235085 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.250212 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.276086 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2twqj\" (UniqueName: \"kubernetes.io/projected/ad68454a-3350-49a5-9047-8b78e81ec79c-kube-api-access-2twqj\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.276160 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.282736 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.298019 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce
1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:
45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.308264 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.319644 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.338749 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.338788 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.338801 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.338818 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.338830 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.342954 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"rding success event on pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0219 08:45:32.230354 6203 factory.go:1336] Added *v1.Pod event handler 3\\\\nI0219 08:45:32.230371 6203 admin_network_policy_controller.go:133] Setting up event handlers for Admin Network Policy\\\\nI0219 08:45:32.230414 6203 ovnkube.go:599] Stopped 
ovnkube\\\\nI0219 08:45:32.230445 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 08:45:32.230454 6203 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0219 08:45:32.230529 6203 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa9
4232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.356974 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.372276 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.377533 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2twqj\" (UniqueName: \"kubernetes.io/projected/ad68454a-3350-49a5-9047-8b78e81ec79c-kube-api-access-2twqj\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.377673 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.377836 4788 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.377928 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs podName:ad68454a-3350-49a5-9047-8b78e81ec79c nodeName:}" failed. No retries permitted until 2026-02-19 08:45:34.877903359 +0000 UTC m=+36.865914871 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs") pod "network-metrics-daemon-qbwlq" (UID: "ad68454a-3350-49a5-9047-8b78e81ec79c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.388341 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.407108 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.417543 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2twqj\" (UniqueName: \"kubernetes.io/projected/ad68454a-3350-49a5-9047-8b78e81ec79c-kube-api-access-2twqj\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.423396 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.437993 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.442404 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.442456 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.442473 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.442499 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.442515 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.458889 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.475484 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.478227 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.478457 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.478537 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.478603 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:45:50.478575612 +0000 UTC m=+52.466587084 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.478646 4788 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.478652 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.478763 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.478899 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.478927 4788 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:34 
crc kubenswrapper[4788]: E0219 08:45:34.478769 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.478989 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.479005 4788 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.478773 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:50.478748336 +0000 UTC m=+52.466759868 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.479082 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.479133 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:50.479094834 +0000 UTC m=+52.467106366 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.479151 4788 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.479190 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:50.479172806 +0000 UTC m=+52.467184388 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.479236 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:45:50.479221777 +0000 UTC m=+52.467233359 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.494558 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\
\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.505501 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.516960 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a019820e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.527579 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc 
kubenswrapper[4788]: I0219 08:45:34.544769 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.544813 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.544822 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.544840 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.544853 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.553797 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"rding success event on pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0219 08:45:32.230354 6203 factory.go:1336] Added *v1.Pod event handler 3\\\\nI0219 08:45:32.230371 6203 admin_network_policy_controller.go:133] Setting up event handlers for Admin Network Policy\\\\nI0219 08:45:32.230414 6203 ovnkube.go:599] Stopped 
ovnkube\\\\nI0219 08:45:32.230445 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 08:45:32.230454 6203 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0219 08:45:32.230529 6203 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa9
4232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.571862 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.584208 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.596896 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.617895 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\
\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e
4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.647930 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.647996 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.648014 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.648040 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.648058 
4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.679281 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 22:40:40.849676538 +0000 UTC Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.713967 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.714028 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.714071 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.714166 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.714270 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.714399 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.750922 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.750991 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.751010 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.751041 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.751060 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.854058 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.854129 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.854150 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.854179 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.854200 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.882911 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.883090 4788 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.883171 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs podName:ad68454a-3350-49a5-9047-8b78e81ec79c nodeName:}" failed. No retries permitted until 2026-02-19 08:45:35.88314884 +0000 UTC m=+37.871160342 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs") pod "network-metrics-daemon-qbwlq" (UID: "ad68454a-3350-49a5-9047-8b78e81ec79c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.903977 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.904218 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.904227 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.904261 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.904277 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.933542 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.938901 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.938956 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.938966 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.938987 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.938999 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.958350 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.964154 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.964189 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.964201 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.964218 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.964230 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:34 crc kubenswrapper[4788]: E0219 08:45:34.981812 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:34Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.986072 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.986103 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.986111 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.986126 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:34 crc kubenswrapper[4788]: I0219 08:45:34.986137 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:34Z","lastTransitionTime":"2026-02-19T08:45:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:35 crc kubenswrapper[4788]: E0219 08:45:35.007830 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:35Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.012541 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.012596 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.012607 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.012624 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.012663 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:35Z","lastTransitionTime":"2026-02-19T08:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:35 crc kubenswrapper[4788]: E0219 08:45:35.031338 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:35Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:35 crc kubenswrapper[4788]: E0219 08:45:35.031932 4788 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.034270 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.034376 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.034400 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.034433 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.034453 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:35Z","lastTransitionTime":"2026-02-19T08:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.138395 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.138460 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.138473 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.138494 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.138507 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:35Z","lastTransitionTime":"2026-02-19T08:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.242204 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.242296 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.242314 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.242358 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.242372 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:35Z","lastTransitionTime":"2026-02-19T08:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.346240 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.346346 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.346366 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.346400 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.346419 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:35Z","lastTransitionTime":"2026-02-19T08:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.449652 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.449704 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.449736 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.449761 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.449773 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:35Z","lastTransitionTime":"2026-02-19T08:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.553286 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.553326 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.553337 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.553352 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.553364 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:35Z","lastTransitionTime":"2026-02-19T08:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.656154 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.656210 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.656227 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.656274 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.656293 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:35Z","lastTransitionTime":"2026-02-19T08:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.679970 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 19:21:35.845355299 +0000 UTC Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.713298 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:35 crc kubenswrapper[4788]: E0219 08:45:35.713483 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.759552 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.759594 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.759602 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.759619 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.759632 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:35Z","lastTransitionTime":"2026-02-19T08:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.863348 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.863410 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.863425 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.863450 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.863469 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:35Z","lastTransitionTime":"2026-02-19T08:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.897601 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:35 crc kubenswrapper[4788]: E0219 08:45:35.897855 4788 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:35 crc kubenswrapper[4788]: E0219 08:45:35.897955 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs podName:ad68454a-3350-49a5-9047-8b78e81ec79c nodeName:}" failed. No retries permitted until 2026-02-19 08:45:37.897927481 +0000 UTC m=+39.885938993 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs") pod "network-metrics-daemon-qbwlq" (UID: "ad68454a-3350-49a5-9047-8b78e81ec79c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.966689 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.966775 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.966802 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.966841 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:35 crc kubenswrapper[4788]: I0219 08:45:35.966867 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:35Z","lastTransitionTime":"2026-02-19T08:45:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.069782 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.069865 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.069887 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.069917 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.069938 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:36Z","lastTransitionTime":"2026-02-19T08:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.171724 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.171769 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.171777 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.171791 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.171800 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:36Z","lastTransitionTime":"2026-02-19T08:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.274792 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.274843 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.274854 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.274875 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.274888 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:36Z","lastTransitionTime":"2026-02-19T08:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.378655 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.378729 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.378753 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.378786 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.378807 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:36Z","lastTransitionTime":"2026-02-19T08:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.481886 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.481949 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.481967 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.481989 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.482007 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:36Z","lastTransitionTime":"2026-02-19T08:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.584863 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.584942 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.584967 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.584995 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.585012 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:36Z","lastTransitionTime":"2026-02-19T08:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.681144 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 01:01:35.006953689 +0000 UTC Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.687355 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.687427 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.687450 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.687481 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.687504 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:36Z","lastTransitionTime":"2026-02-19T08:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.714088 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.714221 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:36 crc kubenswrapper[4788]: E0219 08:45:36.714336 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.714364 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:36 crc kubenswrapper[4788]: E0219 08:45:36.714558 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:36 crc kubenswrapper[4788]: E0219 08:45:36.714680 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.792230 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.792302 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.792319 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.792343 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.792359 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:36Z","lastTransitionTime":"2026-02-19T08:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.895874 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.895945 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.895957 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.895974 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.895987 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:36Z","lastTransitionTime":"2026-02-19T08:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.998356 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.998408 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.998418 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.998438 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:36 crc kubenswrapper[4788]: I0219 08:45:36.998451 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:36Z","lastTransitionTime":"2026-02-19T08:45:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.101486 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.101555 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.101577 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.101608 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.101636 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:37Z","lastTransitionTime":"2026-02-19T08:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.204958 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.205025 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.205043 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.205066 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.205083 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:37Z","lastTransitionTime":"2026-02-19T08:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.309008 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.309099 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.309123 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.309157 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.309181 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:37Z","lastTransitionTime":"2026-02-19T08:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.412847 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.412940 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.412963 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.413001 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.413049 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:37Z","lastTransitionTime":"2026-02-19T08:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.516313 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.516390 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.516401 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.516420 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.516433 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:37Z","lastTransitionTime":"2026-02-19T08:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.618982 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.619048 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.619068 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.619094 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.619115 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:37Z","lastTransitionTime":"2026-02-19T08:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.681477 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 12:54:10.739460224 +0000 UTC Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.714091 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:37 crc kubenswrapper[4788]: E0219 08:45:37.714410 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.722869 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.722940 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.722960 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.722986 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.723006 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:37Z","lastTransitionTime":"2026-02-19T08:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.827146 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.827232 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.827281 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.827314 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.827337 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:37Z","lastTransitionTime":"2026-02-19T08:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.930694 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.930798 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.930866 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.930902 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.930923 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:37Z","lastTransitionTime":"2026-02-19T08:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:37 crc kubenswrapper[4788]: I0219 08:45:37.940990 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:37 crc kubenswrapper[4788]: E0219 08:45:37.941176 4788 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:37 crc kubenswrapper[4788]: E0219 08:45:37.941290 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs podName:ad68454a-3350-49a5-9047-8b78e81ec79c nodeName:}" failed. No retries permitted until 2026-02-19 08:45:41.941232171 +0000 UTC m=+43.929243643 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs") pod "network-metrics-daemon-qbwlq" (UID: "ad68454a-3350-49a5-9047-8b78e81ec79c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.033405 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.033454 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.033471 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.033494 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.033510 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:38Z","lastTransitionTime":"2026-02-19T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.136738 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.136792 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.136808 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.136831 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.136848 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:38Z","lastTransitionTime":"2026-02-19T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.239405 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.239470 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.239494 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.239521 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.239545 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:38Z","lastTransitionTime":"2026-02-19T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.342032 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.342088 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.342105 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.342128 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.342146 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:38Z","lastTransitionTime":"2026-02-19T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.445288 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.445350 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.445373 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.445398 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.445416 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:38Z","lastTransitionTime":"2026-02-19T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.548614 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.548715 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.548740 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.548767 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.548788 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:38Z","lastTransitionTime":"2026-02-19T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.651660 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.651716 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.651732 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.651756 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.651774 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:38Z","lastTransitionTime":"2026-02-19T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.682239 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 18:26:31.290445084 +0000 UTC Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.713911 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:38 crc kubenswrapper[4788]: E0219 08:45:38.714128 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.714207 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.714273 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:38 crc kubenswrapper[4788]: E0219 08:45:38.714425 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:38 crc kubenswrapper[4788]: E0219 08:45:38.714636 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.748627 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"rding success event on pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0219 08:45:32.230354 6203 factory.go:1336] Added *v1.Pod event handler 3\\\\nI0219 08:45:32.230371 6203 admin_network_policy_controller.go:133] Setting up event handlers for Admin Network Policy\\\\nI0219 08:45:32.230414 6203 ovnkube.go:599] Stopped 
ovnkube\\\\nI0219 08:45:32.230445 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 08:45:32.230454 6203 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0219 08:45:32.230529 6203 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa9
4232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.754810 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.755150 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.755385 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.755590 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.755772 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:38Z","lastTransitionTime":"2026-02-19T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.769321 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a019820e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.787523 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:38 crc 
kubenswrapper[4788]: I0219 08:45:38.828338 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.854115 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.862949 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.863034 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.863062 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.863102 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.863226 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:38Z","lastTransitionTime":"2026-02-19T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.872360 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.891221 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.912907 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.932060 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.954217 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.966452 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.966508 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.966529 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.966568 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.966593 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:38Z","lastTransitionTime":"2026-02-19T08:45:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:38 crc kubenswrapper[4788]: I0219 08:45:38.974813 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.005124 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911
452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:38Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.023648 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:39Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.046280 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:39Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.065684 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:39Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.069652 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.069701 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.069719 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.069743 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.069761 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:39Z","lastTransitionTime":"2026-02-19T08:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.085003 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:39Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.104895 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:39Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.173679 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:39 crc 
kubenswrapper[4788]: I0219 08:45:39.173753 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.173773 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.173805 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.173828 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:39Z","lastTransitionTime":"2026-02-19T08:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.277161 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.277207 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.277218 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.277235 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.277272 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:39Z","lastTransitionTime":"2026-02-19T08:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.380825 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.380907 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.380931 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.380967 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.380991 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:39Z","lastTransitionTime":"2026-02-19T08:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.484232 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.484313 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.484329 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.484354 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.484371 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:39Z","lastTransitionTime":"2026-02-19T08:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.586777 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.586822 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.586832 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.586850 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.586862 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:39Z","lastTransitionTime":"2026-02-19T08:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.682595 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 04:53:49.805199169 +0000 UTC Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.690615 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.690642 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.690651 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.690666 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.690676 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:39Z","lastTransitionTime":"2026-02-19T08:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.714072 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:39 crc kubenswrapper[4788]: E0219 08:45:39.714238 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.793957 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.794050 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.794067 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.794088 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.794105 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:39Z","lastTransitionTime":"2026-02-19T08:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.897123 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.897225 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.897276 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.897308 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.897328 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:39Z","lastTransitionTime":"2026-02-19T08:45:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.999904 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:39 crc kubenswrapper[4788]: I0219 08:45:39.999970 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:39.999992 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.000019 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.000084 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:40Z","lastTransitionTime":"2026-02-19T08:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.103118 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.103177 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.103194 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.103218 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.103236 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:40Z","lastTransitionTime":"2026-02-19T08:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.206666 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.206718 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.206734 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.206756 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.206772 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:40Z","lastTransitionTime":"2026-02-19T08:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.310086 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.310574 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.310757 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.310965 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.311135 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:40Z","lastTransitionTime":"2026-02-19T08:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.413544 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.413584 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.413593 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.413610 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.413620 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:40Z","lastTransitionTime":"2026-02-19T08:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.516671 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.516847 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.516872 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.516906 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.516930 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:40Z","lastTransitionTime":"2026-02-19T08:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.620308 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.620376 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.620401 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.620430 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.620454 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:40Z","lastTransitionTime":"2026-02-19T08:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.683704 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 21:23:00.046393341 +0000 UTC Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.714050 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.714090 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:40 crc kubenswrapper[4788]: E0219 08:45:40.714305 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.714355 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:40 crc kubenswrapper[4788]: E0219 08:45:40.714514 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:40 crc kubenswrapper[4788]: E0219 08:45:40.714617 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.723229 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.723337 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.723361 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.723391 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.723416 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:40Z","lastTransitionTime":"2026-02-19T08:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.826977 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.827085 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.827110 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.827141 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.827163 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:40Z","lastTransitionTime":"2026-02-19T08:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.930392 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.930471 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.930493 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.930526 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:40 crc kubenswrapper[4788]: I0219 08:45:40.930547 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:40Z","lastTransitionTime":"2026-02-19T08:45:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.033674 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.033749 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.033766 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.033794 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.033812 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:41Z","lastTransitionTime":"2026-02-19T08:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.136996 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.137071 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.137098 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.137134 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.137153 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:41Z","lastTransitionTime":"2026-02-19T08:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.240570 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.240641 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.240658 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.240682 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.240699 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:41Z","lastTransitionTime":"2026-02-19T08:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.343977 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.344052 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.344074 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.344103 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.344126 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:41Z","lastTransitionTime":"2026-02-19T08:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.447764 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.447840 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.447860 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.447885 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.447906 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:41Z","lastTransitionTime":"2026-02-19T08:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.550621 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.550703 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.550725 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.550756 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.550775 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:41Z","lastTransitionTime":"2026-02-19T08:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.654470 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.654545 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.654559 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.654575 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.654610 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:41Z","lastTransitionTime":"2026-02-19T08:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.684504 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 13:07:28.898806204 +0000 UTC Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.713906 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:41 crc kubenswrapper[4788]: E0219 08:45:41.714115 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.757770 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.757808 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.757819 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.757860 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.757873 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:41Z","lastTransitionTime":"2026-02-19T08:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.860685 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.860772 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.860797 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.860829 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.860855 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:41Z","lastTransitionTime":"2026-02-19T08:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.963697 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.963735 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.963743 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.963757 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.963766 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:41Z","lastTransitionTime":"2026-02-19T08:45:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:41 crc kubenswrapper[4788]: I0219 08:45:41.988584 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:41 crc kubenswrapper[4788]: E0219 08:45:41.988774 4788 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:41 crc kubenswrapper[4788]: E0219 08:45:41.988881 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs podName:ad68454a-3350-49a5-9047-8b78e81ec79c nodeName:}" failed. No retries permitted until 2026-02-19 08:45:49.988851363 +0000 UTC m=+51.976862875 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs") pod "network-metrics-daemon-qbwlq" (UID: "ad68454a-3350-49a5-9047-8b78e81ec79c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.066220 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.066349 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.066361 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.066377 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.066389 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:42Z","lastTransitionTime":"2026-02-19T08:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.169984 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.170078 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.170097 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.170121 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.170138 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:42Z","lastTransitionTime":"2026-02-19T08:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.273856 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.273930 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.273953 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.273986 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.274013 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:42Z","lastTransitionTime":"2026-02-19T08:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.377425 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.377491 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.377507 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.377529 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.377550 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:42Z","lastTransitionTime":"2026-02-19T08:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.481121 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.481202 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.481225 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.481291 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.481312 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:42Z","lastTransitionTime":"2026-02-19T08:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.584703 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.584769 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.584790 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.584818 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.584844 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:42Z","lastTransitionTime":"2026-02-19T08:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.685492 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 15:33:00.598498456 +0000 UTC Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.687583 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.687657 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.687678 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.687703 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.687722 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:42Z","lastTransitionTime":"2026-02-19T08:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.714297 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.714339 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.714453 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:42 crc kubenswrapper[4788]: E0219 08:45:42.714591 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:42 crc kubenswrapper[4788]: E0219 08:45:42.714771 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:42 crc kubenswrapper[4788]: E0219 08:45:42.714976 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.791465 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.791547 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.791573 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.791603 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.791626 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:42Z","lastTransitionTime":"2026-02-19T08:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.895575 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.895935 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.896080 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.896220 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.896440 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:42Z","lastTransitionTime":"2026-02-19T08:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.999664 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:42 crc kubenswrapper[4788]: I0219 08:45:42.999736 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:42.999758 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:42.999789 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:42.999814 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:42Z","lastTransitionTime":"2026-02-19T08:45:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.103492 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.103567 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.103591 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.103627 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.103651 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:43Z","lastTransitionTime":"2026-02-19T08:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.206763 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.206823 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.206846 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.206875 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.206901 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:43Z","lastTransitionTime":"2026-02-19T08:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.310700 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.310754 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.310771 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.310793 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.310810 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:43Z","lastTransitionTime":"2026-02-19T08:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.413857 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.413932 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.413956 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.413984 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.414005 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:43Z","lastTransitionTime":"2026-02-19T08:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.516221 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.516313 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.516345 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.516371 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.516386 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:43Z","lastTransitionTime":"2026-02-19T08:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.618741 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.618805 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.618822 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.618847 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.618865 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:43Z","lastTransitionTime":"2026-02-19T08:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.685878 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 01:18:24.18945701 +0000 UTC Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.713679 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:43 crc kubenswrapper[4788]: E0219 08:45:43.713868 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.721367 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.721420 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.721437 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.721459 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.721475 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:43Z","lastTransitionTime":"2026-02-19T08:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.824024 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.824076 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.824096 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.824117 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.824132 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:43Z","lastTransitionTime":"2026-02-19T08:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.927111 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.927178 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.927198 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.927658 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:43 crc kubenswrapper[4788]: I0219 08:45:43.927930 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:43Z","lastTransitionTime":"2026-02-19T08:45:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.030035 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.030097 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.030115 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.030133 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.030146 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:44Z","lastTransitionTime":"2026-02-19T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.134080 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.134148 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.134172 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.134201 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.134297 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:44Z","lastTransitionTime":"2026-02-19T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.237199 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.237300 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.237323 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.237353 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.237377 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:44Z","lastTransitionTime":"2026-02-19T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.340493 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.340562 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.340583 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.340612 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.340634 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:44Z","lastTransitionTime":"2026-02-19T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.444046 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.444110 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.444132 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.444158 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.444177 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:44Z","lastTransitionTime":"2026-02-19T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.547460 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.547524 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.547542 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.547568 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.547586 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:44Z","lastTransitionTime":"2026-02-19T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.649817 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.649886 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.649914 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.649945 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.649967 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:44Z","lastTransitionTime":"2026-02-19T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.686937 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 12:32:12.619183464 +0000 UTC Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.713585 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:44 crc kubenswrapper[4788]: E0219 08:45:44.713781 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.713946 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.713976 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:44 crc kubenswrapper[4788]: E0219 08:45:44.714338 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:44 crc kubenswrapper[4788]: E0219 08:45:44.714470 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.753284 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.753333 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.753356 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.753383 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.753401 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:44Z","lastTransitionTime":"2026-02-19T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.858551 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.858607 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.858624 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.858652 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.858671 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:44Z","lastTransitionTime":"2026-02-19T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.962551 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.962629 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.962648 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.962676 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:44 crc kubenswrapper[4788]: I0219 08:45:44.962698 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:44Z","lastTransitionTime":"2026-02-19T08:45:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.065815 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.065890 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.065909 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.065940 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.065961 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.168871 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.168924 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.168935 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.168952 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.168964 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.271917 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.272011 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.272037 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.272071 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.272099 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.375397 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.375472 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.375495 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.375528 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.375552 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.404901 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.404972 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.404990 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.405014 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.405033 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:45 crc kubenswrapper[4788]: E0219 08:45:45.426945 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:45Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.432571 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.432641 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.432666 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.432702 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.432727 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:45 crc kubenswrapper[4788]: E0219 08:45:45.454922 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:45Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.460713 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.460807 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.460826 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.460851 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.460869 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:45 crc kubenswrapper[4788]: E0219 08:45:45.482344 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:45Z is after 2025-08-24T17:21:41Z"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.487402 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.487467 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.487485 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.487510 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.487530 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:45 crc kubenswrapper[4788]: E0219 08:45:45.506559 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:45Z is after 2025-08-24T17:21:41Z"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.511876 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.511961 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.511986 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.512018 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.512042 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:45 crc kubenswrapper[4788]: E0219 08:45:45.532730 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:45Z is after 2025-08-24T17:21:41Z"
Feb 19 08:45:45 crc kubenswrapper[4788]: E0219 08:45:45.532951 4788 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.535161 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.535220 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.535236 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.535300 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.535323 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.638558 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.638631 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.638674 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.638702 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.638720 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.687381 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 07:02:59.562465096 +0000 UTC
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.713937 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq"
Feb 19 08:45:45 crc kubenswrapper[4788]: E0219 08:45:45.714150 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.742106 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.742172 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.742197 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.742228 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.742287 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.844833 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.844910 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.844927 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.844951 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.844967 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.947837 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.947896 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.947914 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.947941 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:45 crc kubenswrapper[4788]: I0219 08:45:45.947960 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:45Z","lastTransitionTime":"2026-02-19T08:45:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.050698 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.050769 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.050788 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.050814 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.050834 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:46Z","lastTransitionTime":"2026-02-19T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.154007 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.154068 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.154084 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.154109 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.154126 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:46Z","lastTransitionTime":"2026-02-19T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.257700 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.257764 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.257783 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.257808 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.257829 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:46Z","lastTransitionTime":"2026-02-19T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.361044 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.361101 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.361118 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.361141 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.361158 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:46Z","lastTransitionTime":"2026-02-19T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.464081 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.464145 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.464163 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.464210 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.464229 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:46Z","lastTransitionTime":"2026-02-19T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.567214 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.567319 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.567341 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.567372 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.567393 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:46Z","lastTransitionTime":"2026-02-19T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.670177 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.670234 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.670284 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.670315 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.670341 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:46Z","lastTransitionTime":"2026-02-19T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.688462 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 04:00:02.347195416 +0000 UTC Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.713475 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.713551 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.713567 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:46 crc kubenswrapper[4788]: E0219 08:45:46.713716 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:46 crc kubenswrapper[4788]: E0219 08:45:46.714499 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:46 crc kubenswrapper[4788]: E0219 08:45:46.714608 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.715116 4788 scope.go:117] "RemoveContainer" containerID="20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.773763 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.774053 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.774071 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.774094 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.774111 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:46Z","lastTransitionTime":"2026-02-19T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.875945 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.875987 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.875999 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.876015 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.876026 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:46Z","lastTransitionTime":"2026-02-19T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.978743 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.978781 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.978790 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.978803 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:46 crc kubenswrapper[4788]: I0219 08:45:46.978812 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:46Z","lastTransitionTime":"2026-02-19T08:45:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.039452 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/1.log" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.041259 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000"} Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.041385 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.080596 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.080621 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.080630 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.080640 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.080648 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:47Z","lastTransitionTime":"2026-02-19T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.182286 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.182350 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.182375 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.182405 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.182469 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:47Z","lastTransitionTime":"2026-02-19T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.227335 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"rding success event on pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0219 08:45:32.230354 6203 factory.go:1336] Added *v1.Pod event handler 3\\\\nI0219 08:45:32.230371 6203 admin_network_policy_controller.go:133] Setting up event handlers for Admin Network Policy\\\\nI0219 08:45:32.230414 6203 ovnkube.go:599] Stopped 
ovnkube\\\\nI0219 08:45:32.230445 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 08:45:32.230454 6203 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0219 08:45:32.230529 6203 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.243735 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a0198
20e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.263560 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc 
kubenswrapper[4788]: I0219 08:45:47.286418 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.286499 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.286525 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.286557 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.286581 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:47Z","lastTransitionTime":"2026-02-19T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.296786 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.313411 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.329611 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.344216 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.360298 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.385204 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.388530 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.388578 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.388590 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.388606 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.388617 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:47Z","lastTransitionTime":"2026-02-19T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.400385 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc54
6474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.428151 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.445350 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.463087 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.474601 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.484391 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.490666 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.490697 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.490708 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.490726 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.490737 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:47Z","lastTransitionTime":"2026-02-19T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.494157 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.505552 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:47Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.593768 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.593810 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.593824 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 
08:45:47.593841 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.593854 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:47Z","lastTransitionTime":"2026-02-19T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.689467 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 22:47:39.387213388 +0000 UTC Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.696570 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.696625 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.696643 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.696666 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.696682 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:47Z","lastTransitionTime":"2026-02-19T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.713900 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:47 crc kubenswrapper[4788]: E0219 08:45:47.714071 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.799343 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.799391 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.799405 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.799427 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.799450 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:47Z","lastTransitionTime":"2026-02-19T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.902843 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.902888 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.902899 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.902918 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:47 crc kubenswrapper[4788]: I0219 08:45:47.902931 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:47Z","lastTransitionTime":"2026-02-19T08:45:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.005719 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.005786 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.005807 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.005831 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.005849 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:48Z","lastTransitionTime":"2026-02-19T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.047024 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/2.log" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.048320 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/1.log" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.052216 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000" exitCode=1 Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.052341 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.052429 4788 scope.go:117] "RemoveContainer" containerID="20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.053663 4788 scope.go:117] "RemoveContainer" containerID="15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000" Feb 19 08:45:48 crc kubenswrapper[4788]: E0219 08:45:48.053961 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.074066 4788 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.100907 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.108570 4788 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.108629 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.108648 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.108673 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.108692 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:48Z","lastTransitionTime":"2026-02-19T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.120126 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.139586 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.142523 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.159069 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.182332 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce
1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:
45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.212352 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.212405 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.212423 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.212448 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.212467 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:48Z","lastTransitionTime":"2026-02-19T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.213933 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"rding success event on pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0219 08:45:32.230354 6203 factory.go:1336] Added *v1.Pod event handler 3\\\\nI0219 08:45:32.230371 6203 admin_network_policy_controller.go:133] Setting up event handlers for Admin Network Policy\\\\nI0219 08:45:32.230414 6203 ovnkube.go:599] Stopped 
ovnkube\\\\nI0219 08:45:32.230445 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 08:45:32.230454 6203 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0219 08:45:32.230529 6203 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:47Z\\\",\\\"message\\\":\\\"erved-pods:v4/a13607449821398607916) with []\\\\nI0219 08:45:47.877318 6413 factory.go:1336] Added *v1.Node event handler 7\\\\nI0219 08:45:47.877447 6413 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0219 08:45:47.877468 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:47.877609 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:47.877679 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:47.877694 6413 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:47.877714 6413 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:45:47.877723 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:45:47.877746 6413 factory.go:656] Stopping watch factory\\\\nI0219 08:45:47.877771 6413 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:45:47.878078 6413 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0219 08:45:47.878233 6413 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0219 08:45:47.878365 6413 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:47.878438 6413 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 08:45:47.878574 6413 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"
host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.230912 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a019820e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.248852 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc 
kubenswrapper[4788]: I0219 08:45:48.285558 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.316082 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.316125 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.316136 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.316155 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.316166 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:48Z","lastTransitionTime":"2026-02-19T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.339151 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.350737 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.373781 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f3
6f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.389749 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.404524 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.418877 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.418905 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.418916 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.418931 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.418941 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:48Z","lastTransitionTime":"2026-02-19T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.421713 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.443211 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.522399 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.522623 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.522835 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.522874 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.522898 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:48Z","lastTransitionTime":"2026-02-19T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.626075 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.626131 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.626157 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.626190 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.626216 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:48Z","lastTransitionTime":"2026-02-19T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.690530 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 00:00:03.510499277 +0000 UTC Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.713947 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.713968 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:48 crc kubenswrapper[4788]: E0219 08:45:48.714121 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:48 crc kubenswrapper[4788]: E0219 08:45:48.714310 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.714733 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:48 crc kubenswrapper[4788]: E0219 08:45:48.715048 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.729324 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.729381 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.729402 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.729428 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.729449 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:48Z","lastTransitionTime":"2026-02-19T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.749104 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20baba1612d11eff7762abf6ddf359eb9d0639df86f1a06f0f854a7b919f064f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"message\\\":\\\"rding success event on pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0219 08:45:32.230354 6203 factory.go:1336] Added *v1.Pod event handler 3\\\\nI0219 08:45:32.230371 6203 admin_network_policy_controller.go:133] Setting up event handlers for Admin Network Policy\\\\nI0219 08:45:32.230414 6203 ovnkube.go:599] Stopped 
ovnkube\\\\nI0219 08:45:32.230445 6203 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0219 08:45:32.230454 6203 obj_retry.go:434] periodicallyRetryResources: Retry channel got triggered: retrying failed objects of type *v1.Pod\\\\nF0219 08:45:32.230529 6203 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:47Z\\\",\\\"message\\\":\\\"erved-pods:v4/a13607449821398607916) with []\\\\nI0219 08:45:47.877318 6413 factory.go:1336] Added *v1.Node event handler 7\\\\nI0219 08:45:47.877447 6413 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0219 08:45:47.877468 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:47.877609 6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:47.877679 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:47.877694 6413 
handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:47.877714 6413 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:45:47.877723 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:45:47.877746 6413 factory.go:656] Stopping watch factory\\\\nI0219 08:45:47.877771 6413 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:45:47.878078 6413 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0219 08:45:47.878233 6413 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0219 08:45:47.878365 6413 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:47.878438 6413 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 08:45:47.878574 6413 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"
host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.767971 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a019820e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.784065 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc 
kubenswrapper[4788]: I0219 08:45:48.814891 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.832165 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.832230 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.832298 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.832328 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.832349 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:48Z","lastTransitionTime":"2026-02-19T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.838931 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.854989 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.869446 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f3
6f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.887177 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.908483 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.925809 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.935706 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.935740 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.935752 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.935771 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.935783 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:48Z","lastTransitionTime":"2026-02-19T08:45:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.945781 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.964979 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:48 crc kubenswrapper[4788]: I0219 08:45:48.987140 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:48Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.005218 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.023800 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.037898 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:49 crc 
kubenswrapper[4788]: I0219 08:45:49.037947 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.037965 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.037989 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.038006 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:49Z","lastTransitionTime":"2026-02-19T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.047419 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911
452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.056897 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/2.log" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.058124 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9f
a29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.061794 4788 scope.go:117] "RemoveContainer" containerID="15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000" Feb 19 08:45:49 crc kubenswrapper[4788]: E0219 08:45:49.061937 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.075018 4788 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.088598 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.102364 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.116740 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.128386 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce
1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:
45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.136466 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.140006 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.140062 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.140080 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.140107 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.140125 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:49Z","lastTransitionTime":"2026-02-19T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.148721 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a019820e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.160156 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc 
kubenswrapper[4788]: I0219 08:45:49.177969 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:47Z\\\",\\\"message\\\":\\\"erved-pods:v4/a13607449821398607916) with []\\\\nI0219 08:45:47.877318 6413 factory.go:1336] Added *v1.Node event handler 7\\\\nI0219 08:45:47.877447 6413 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0219 08:45:47.877468 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:47.877609 6413 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:47.877679 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:47.877714 6413 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:45:47.877723 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:45:47.877746 6413 factory.go:656] Stopping watch factory\\\\nI0219 08:45:47.877771 6413 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:45:47.878078 6413 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0219 08:45:47.878233 6413 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0219 08:45:47.878365 6413 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:47.878438 6413 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 08:45:47.878574 6413 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa9
4232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.196684 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.211694 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.226944 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.242955 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.243029 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.243050 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.243078 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.243098 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:49Z","lastTransitionTime":"2026-02-19T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.254577 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.270308 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.281073 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.296835 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.312141 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:49Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.346388 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.346698 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.346837 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.346988 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.347118 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:49Z","lastTransitionTime":"2026-02-19T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.450606 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.450663 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.450680 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.450703 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.450721 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:49Z","lastTransitionTime":"2026-02-19T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.553948 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.554034 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.554052 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.554075 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.554093 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:49Z","lastTransitionTime":"2026-02-19T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.657830 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.657865 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.657874 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.657886 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.657895 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:49Z","lastTransitionTime":"2026-02-19T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.691711 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 15:48:33.436452019 +0000 UTC Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.714135 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:49 crc kubenswrapper[4788]: E0219 08:45:49.714430 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.760566 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.760665 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.760686 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.760711 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.760731 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:49Z","lastTransitionTime":"2026-02-19T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.863471 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.863537 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.863554 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.863579 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.863597 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:49Z","lastTransitionTime":"2026-02-19T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.967297 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.967381 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.967407 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.967439 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:49 crc kubenswrapper[4788]: I0219 08:45:49.967464 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:49Z","lastTransitionTime":"2026-02-19T08:45:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.070474 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.070544 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.070571 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.070601 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.070624 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:50Z","lastTransitionTime":"2026-02-19T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.080312 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.080517 4788 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.080616 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs podName:ad68454a-3350-49a5-9047-8b78e81ec79c nodeName:}" failed. No retries permitted until 2026-02-19 08:46:06.080588366 +0000 UTC m=+68.068599878 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs") pod "network-metrics-daemon-qbwlq" (UID: "ad68454a-3350-49a5-9047-8b78e81ec79c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.173362 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.173423 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.173441 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.173465 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.173484 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:50Z","lastTransitionTime":"2026-02-19T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.276646 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.276716 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.276740 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.276773 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.276795 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:50Z","lastTransitionTime":"2026-02-19T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.380346 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.380417 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.380434 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.380459 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.380475 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:50Z","lastTransitionTime":"2026-02-19T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.434447 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.451333 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.457020 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.483091 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.483588 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.483638 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.483659 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.483683 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.483700 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:50Z","lastTransitionTime":"2026-02-19T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.484673 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.484825 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:46:22.484793345 +0000 UTC m=+84.472804857 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.484899 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.484993 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485088 4788 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.485098 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.485139 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485164 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:46:22.485142463 +0000 UTC m=+84.473153965 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485284 4788 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485298 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485320 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485333 4788 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485342 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485388 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485408 
4788 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485353 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:46:22.485337288 +0000 UTC m=+84.473348800 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485507 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 08:46:22.485481181 +0000 UTC m=+84.473492683 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.485530 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 08:46:22.485519262 +0000 UTC m=+84.473530764 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.508549 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.527759 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.547975 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.571924 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.585895 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.586707 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.586765 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.586783 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.586805 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.586824 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:50Z","lastTransitionTime":"2026-02-19T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.602345 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.615976 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.634301 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.666145 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:47Z\\\",\\\"message\\\":\\\"erved-pods:v4/a13607449821398607916) with []\\\\nI0219 08:45:47.877318 6413 factory.go:1336] Added *v1.Node event handler 7\\\\nI0219 08:45:47.877447 6413 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0219 08:45:47.877468 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:47.877609 6413 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:47.877679 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:47.877714 6413 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:45:47.877723 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:45:47.877746 6413 factory.go:656] Stopping watch factory\\\\nI0219 08:45:47.877771 6413 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:45:47.878078 6413 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0219 08:45:47.878233 6413 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0219 08:45:47.878365 6413 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:47.878438 6413 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 08:45:47.878574 6413 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa9
4232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.681926 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a0198
20e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.689568 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.689612 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.689626 4788 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.689645 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.689659 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:50Z","lastTransitionTime":"2026-02-19T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.691846 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 07:02:15.333059006 +0000 UTC Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.700978 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.713726 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.713808 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.713832 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.714011 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.714125 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:50 crc kubenswrapper[4788]: E0219 08:45:50.714280 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.730673 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\
\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.748471 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.771335 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.791039 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:50Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.792472 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.792536 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.792555 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.792580 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.792599 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:50Z","lastTransitionTime":"2026-02-19T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.895489 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.895539 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.895555 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.895580 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.895596 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:50Z","lastTransitionTime":"2026-02-19T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.998292 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.998362 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.998383 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.998407 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:50 crc kubenswrapper[4788]: I0219 08:45:50.998424 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:50Z","lastTransitionTime":"2026-02-19T08:45:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.100779 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.100851 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.100870 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.100893 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.100910 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:51Z","lastTransitionTime":"2026-02-19T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.204387 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.204439 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.204450 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.204468 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.204480 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:51Z","lastTransitionTime":"2026-02-19T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.306753 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.306826 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.306852 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.306881 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.306906 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:51Z","lastTransitionTime":"2026-02-19T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.409617 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.409687 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.409712 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.409737 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.409762 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:51Z","lastTransitionTime":"2026-02-19T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.513221 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.513320 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.513338 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.513365 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.513383 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:51Z","lastTransitionTime":"2026-02-19T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.616911 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.616983 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.617003 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.617030 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.617049 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:51Z","lastTransitionTime":"2026-02-19T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.692152 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 23:29:37.190494748 +0000 UTC Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.713772 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:51 crc kubenswrapper[4788]: E0219 08:45:51.714142 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.719929 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.719972 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.719990 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.720013 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.720030 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:51Z","lastTransitionTime":"2026-02-19T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.823022 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.823098 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.823122 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.823153 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.823175 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:51Z","lastTransitionTime":"2026-02-19T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.926358 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.926411 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.926428 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.926452 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:51 crc kubenswrapper[4788]: I0219 08:45:51.926469 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:51Z","lastTransitionTime":"2026-02-19T08:45:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.030780 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.030856 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.030879 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.030902 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.030920 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:52Z","lastTransitionTime":"2026-02-19T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.134339 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.134416 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.134435 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.134461 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.134480 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:52Z","lastTransitionTime":"2026-02-19T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.236933 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.236976 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.236987 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.237003 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.237015 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:52Z","lastTransitionTime":"2026-02-19T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.340445 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.340481 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.340490 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.340502 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.340511 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:52Z","lastTransitionTime":"2026-02-19T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.443190 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.443233 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.443272 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.443292 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.443307 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:52Z","lastTransitionTime":"2026-02-19T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.546571 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.546662 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.546681 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.546705 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.546721 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:52Z","lastTransitionTime":"2026-02-19T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.652112 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.652171 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.652188 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.652205 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.652217 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:52Z","lastTransitionTime":"2026-02-19T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.693335 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 19:43:49.101905796 +0000 UTC Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.713976 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.714090 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:52 crc kubenswrapper[4788]: E0219 08:45:52.714321 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.714377 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:52 crc kubenswrapper[4788]: E0219 08:45:52.714513 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:52 crc kubenswrapper[4788]: E0219 08:45:52.714713 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.755581 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.755639 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.755656 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.755685 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.755710 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:52Z","lastTransitionTime":"2026-02-19T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.859062 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.859109 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.859127 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.859150 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.859167 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:52Z","lastTransitionTime":"2026-02-19T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.961829 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.961897 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.961915 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.961939 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:52 crc kubenswrapper[4788]: I0219 08:45:52.961958 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:52Z","lastTransitionTime":"2026-02-19T08:45:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.064660 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.064714 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.064748 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.064771 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.064786 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:53Z","lastTransitionTime":"2026-02-19T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.167671 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.167727 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.167747 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.167772 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.167792 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:53Z","lastTransitionTime":"2026-02-19T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.270620 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.270715 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.270733 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.270760 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.270779 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:53Z","lastTransitionTime":"2026-02-19T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.373832 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.373874 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.373891 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.373915 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.373933 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:53Z","lastTransitionTime":"2026-02-19T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.476886 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.477166 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.477288 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.477409 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.477506 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:53Z","lastTransitionTime":"2026-02-19T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.581450 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.581548 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.581567 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.581590 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.581611 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:53Z","lastTransitionTime":"2026-02-19T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.685341 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.685406 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.685424 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.685449 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.685467 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:53Z","lastTransitionTime":"2026-02-19T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.693574 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 18:52:20.097357524 +0000 UTC Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.714013 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:53 crc kubenswrapper[4788]: E0219 08:45:53.714215 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.787453 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.787481 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.787489 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.787502 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.787510 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:53Z","lastTransitionTime":"2026-02-19T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.890297 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.890377 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.890394 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.890416 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.890434 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:53Z","lastTransitionTime":"2026-02-19T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.993202 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.993300 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.993325 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.993357 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:53 crc kubenswrapper[4788]: I0219 08:45:53.993378 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:53Z","lastTransitionTime":"2026-02-19T08:45:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.096762 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.096837 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.096856 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.097315 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.097365 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:54Z","lastTransitionTime":"2026-02-19T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.200462 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.200513 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.200530 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.200601 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.200619 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:54Z","lastTransitionTime":"2026-02-19T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.303400 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.303455 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.303472 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.303495 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.303513 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:54Z","lastTransitionTime":"2026-02-19T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.406200 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.406299 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.406317 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.406341 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.406358 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:54Z","lastTransitionTime":"2026-02-19T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.510828 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.510898 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.510919 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.510947 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.510968 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:54Z","lastTransitionTime":"2026-02-19T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.614634 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.614720 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.614739 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.614767 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.614788 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:54Z","lastTransitionTime":"2026-02-19T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.694807 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 18:53:27.693598368 +0000 UTC Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.714157 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.714205 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:54 crc kubenswrapper[4788]: E0219 08:45:54.714402 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.714458 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:54 crc kubenswrapper[4788]: E0219 08:45:54.714579 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:54 crc kubenswrapper[4788]: E0219 08:45:54.714650 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.717678 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.717707 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.717715 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.717726 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.717735 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:54Z","lastTransitionTime":"2026-02-19T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.819916 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.819966 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.819982 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.820003 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.820018 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:54Z","lastTransitionTime":"2026-02-19T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.923006 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.923059 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.923071 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.923089 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:54 crc kubenswrapper[4788]: I0219 08:45:54.923101 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:54Z","lastTransitionTime":"2026-02-19T08:45:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.026153 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.026232 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.026285 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.026317 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.026341 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.129786 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.129865 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.129890 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.129926 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.129948 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.233770 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.233899 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.233919 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.233984 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.234002 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.337489 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.337557 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.337576 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.337603 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.337621 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.440763 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.440822 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.440835 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.440854 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.440866 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.543728 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.543790 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.543809 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.543838 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.543856 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.646356 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.646406 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.646418 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.646435 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.646449 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.695314 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 16:15:58.050893215 +0000 UTC Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.713397 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:55 crc kubenswrapper[4788]: E0219 08:45:55.713597 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.749693 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.749780 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.749798 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.749821 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.749840 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.852726 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.852785 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.852801 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.852826 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.852842 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.905820 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.905852 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.905879 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.905893 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.905903 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: E0219 08:45:55.928794 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.934958 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.934983 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.934995 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.935013 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.935026 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: E0219 08:45:55.955624 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.960980 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.961053 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.961073 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.961099 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.961119 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:55 crc kubenswrapper[4788]: E0219 08:45:55.982149 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:55Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.986721 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.986773 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.986793 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.986821 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:55 crc kubenswrapper[4788]: I0219 08:45:55.986838 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:55Z","lastTransitionTime":"2026-02-19T08:45:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:56 crc kubenswrapper[4788]: E0219 08:45:56.007627 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.013355 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.013403 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.013414 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.013437 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.013449 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:56Z","lastTransitionTime":"2026-02-19T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:56 crc kubenswrapper[4788]: E0219 08:45:56.032635 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:45:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:56Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:56 crc kubenswrapper[4788]: E0219 08:45:56.032856 4788 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.034984 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.035039 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.035059 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.035081 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.035099 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:56Z","lastTransitionTime":"2026-02-19T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.137838 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.137905 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.137922 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.137945 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.137963 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:56Z","lastTransitionTime":"2026-02-19T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.241018 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.241097 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.241135 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.241172 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.241194 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:56Z","lastTransitionTime":"2026-02-19T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.344516 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.344576 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.344593 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.344617 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.344635 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:56Z","lastTransitionTime":"2026-02-19T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.447953 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.448016 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.448034 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.448059 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.448076 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:56Z","lastTransitionTime":"2026-02-19T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.551730 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.551808 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.551834 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.551864 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.551887 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:56Z","lastTransitionTime":"2026-02-19T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.654598 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.654668 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.654690 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.654722 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.654745 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:56Z","lastTransitionTime":"2026-02-19T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.696353 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 18:11:15.518157738 +0000 UTC Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.713990 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.714226 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:56 crc kubenswrapper[4788]: E0219 08:45:56.714388 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.714447 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:56 crc kubenswrapper[4788]: E0219 08:45:56.714686 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:56 crc kubenswrapper[4788]: E0219 08:45:56.714892 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.757732 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.757812 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.757839 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.757865 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.757886 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:56Z","lastTransitionTime":"2026-02-19T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.860891 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.860962 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.860981 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.861004 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.861022 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:56Z","lastTransitionTime":"2026-02-19T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.964087 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.964144 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.964161 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.964186 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:56 crc kubenswrapper[4788]: I0219 08:45:56.964205 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:56Z","lastTransitionTime":"2026-02-19T08:45:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.066740 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.066800 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.066819 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.066845 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.066863 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:57Z","lastTransitionTime":"2026-02-19T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.169113 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.169191 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.169214 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.169283 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.169309 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:57Z","lastTransitionTime":"2026-02-19T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.272597 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.272654 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.272672 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.272697 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.272714 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:57Z","lastTransitionTime":"2026-02-19T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.375299 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.375340 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.375348 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.375363 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.375375 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:57Z","lastTransitionTime":"2026-02-19T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.478318 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.478841 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.478866 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.478883 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.478895 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:57Z","lastTransitionTime":"2026-02-19T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.582036 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.582129 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.582154 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.582187 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.582209 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:57Z","lastTransitionTime":"2026-02-19T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.686028 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.686101 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.686118 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.686141 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.686159 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:57Z","lastTransitionTime":"2026-02-19T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.697126 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 21:50:34.132610753 +0000 UTC Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.713831 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:57 crc kubenswrapper[4788]: E0219 08:45:57.714002 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.790290 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.790349 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.790367 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.790397 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.790415 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:57Z","lastTransitionTime":"2026-02-19T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.894166 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.894222 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.894239 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.894300 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.894318 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:57Z","lastTransitionTime":"2026-02-19T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.996626 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.996687 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.996710 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.996739 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:57 crc kubenswrapper[4788]: I0219 08:45:57.996762 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:57Z","lastTransitionTime":"2026-02-19T08:45:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.099683 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.099770 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.099807 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.099836 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.099857 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:58Z","lastTransitionTime":"2026-02-19T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.203154 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.203222 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.203240 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.203323 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.203352 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:58Z","lastTransitionTime":"2026-02-19T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.305581 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.305631 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.305649 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.305674 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.305694 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:58Z","lastTransitionTime":"2026-02-19T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.408690 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.408766 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.408786 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.408812 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.408831 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:58Z","lastTransitionTime":"2026-02-19T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.511660 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.511753 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.511773 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.511798 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.511820 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:58Z","lastTransitionTime":"2026-02-19T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.615538 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.615637 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.615679 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.616764 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.616791 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:58Z","lastTransitionTime":"2026-02-19T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.698134 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 03:53:45.171194306 +0000 UTC Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.713500 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.713526 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.713587 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:45:58 crc kubenswrapper[4788]: E0219 08:45:58.714197 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:45:58 crc kubenswrapper[4788]: E0219 08:45:58.714363 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:45:58 crc kubenswrapper[4788]: E0219 08:45:58.714492 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.721085 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.721145 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.721163 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.721188 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.721205 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:58Z","lastTransitionTime":"2026-02-19T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.740140 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.765227 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.786829 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.808476 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.827109 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.827394 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.827414 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.827879 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.828118 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:58Z","lastTransitionTime":"2026-02-19T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.838958 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.857659 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.872148 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61ae5a8b-ebc9-4f33-a0a5-d0e94443363f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68431cd65fb593de62afcc19c22d4e3f8d8da669e9a1fa22c820abbbf27585a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefb227dd2add39a5e4c83674ebe19e375697257f8bc8c29a15e951e49650be4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e5428237ae5741c34aae6754486baedada4ba34c8d5134ea392c6d54698836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.887557 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.905723 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.926700 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.930836 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.931094 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.931320 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.931500 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.931664 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:58Z","lastTransitionTime":"2026-02-19T08:45:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.948057 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.978773 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:47Z\\\",\\\"message\\\":\\\"erved-pods:v4/a13607449821398607916) with []\\\\nI0219 08:45:47.877318 6413 factory.go:1336] Added *v1.Node event handler 7\\\\nI0219 08:45:47.877447 6413 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0219 08:45:47.877468 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:47.877609 6413 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:47.877679 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:47.877714 6413 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:45:47.877723 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:45:47.877746 6413 factory.go:656] Stopping watch factory\\\\nI0219 08:45:47.877771 6413 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:45:47.878078 6413 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0219 08:45:47.878233 6413 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0219 08:45:47.878365 6413 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:47.878438 6413 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 08:45:47.878574 6413 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa9
4232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:58 crc kubenswrapper[4788]: I0219 08:45:58.999198 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a0198
20e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:58Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.016558 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:59 crc 
kubenswrapper[4788]: I0219 08:45:59.034979 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.035035 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.035051 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.035077 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.035097 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:59Z","lastTransitionTime":"2026-02-19T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.050176 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.075146 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.093451 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.118727 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:45:59Z is after 2025-08-24T17:21:41Z" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.137379 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.137443 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.137466 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.137499 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.137523 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:59Z","lastTransitionTime":"2026-02-19T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.241896 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.241972 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.241992 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.242021 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.242040 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:59Z","lastTransitionTime":"2026-02-19T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.346500 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.346593 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.346618 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.346653 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.346679 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:59Z","lastTransitionTime":"2026-02-19T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.450205 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.450360 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.450386 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.450424 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.450445 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:59Z","lastTransitionTime":"2026-02-19T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.554186 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.554318 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.554340 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.554366 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.554384 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:59Z","lastTransitionTime":"2026-02-19T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.657963 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.658035 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.658054 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.658086 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.658104 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:59Z","lastTransitionTime":"2026-02-19T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.699328 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 10:48:19.769189702 +0000 UTC Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.713881 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:45:59 crc kubenswrapper[4788]: E0219 08:45:59.714122 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.765731 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.765803 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.765825 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.765854 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.765882 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:59Z","lastTransitionTime":"2026-02-19T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.869388 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.869456 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.869474 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.869504 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.869523 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:59Z","lastTransitionTime":"2026-02-19T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.972885 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.972951 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.972969 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.972995 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:45:59 crc kubenswrapper[4788]: I0219 08:45:59.973014 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:45:59Z","lastTransitionTime":"2026-02-19T08:45:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.075945 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.076017 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.076039 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.076068 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.076086 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:00Z","lastTransitionTime":"2026-02-19T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.179401 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.179447 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.179457 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.179477 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.179490 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:00Z","lastTransitionTime":"2026-02-19T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.283339 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.283413 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.283431 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.283459 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.283477 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:00Z","lastTransitionTime":"2026-02-19T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.386885 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.386966 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.386979 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.387038 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.387051 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:00Z","lastTransitionTime":"2026-02-19T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.490200 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.490288 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.490310 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.490334 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.490353 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:00Z","lastTransitionTime":"2026-02-19T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.593185 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.593239 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.593280 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.593302 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.593319 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:00Z","lastTransitionTime":"2026-02-19T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.695597 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.695676 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.695702 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.695738 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.695759 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:00Z","lastTransitionTime":"2026-02-19T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.699776 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 23:52:03.367380429 +0000 UTC Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.714609 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.714720 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.714640 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:00 crc kubenswrapper[4788]: E0219 08:46:00.714840 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:00 crc kubenswrapper[4788]: E0219 08:46:00.715107 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:00 crc kubenswrapper[4788]: E0219 08:46:00.715183 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.799359 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.799428 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.799448 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.799477 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.799497 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:00Z","lastTransitionTime":"2026-02-19T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.902708 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.902776 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.902795 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.902820 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:00 crc kubenswrapper[4788]: I0219 08:46:00.902841 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:00Z","lastTransitionTime":"2026-02-19T08:46:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.005937 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.006014 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.006031 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.006058 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.006077 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:01Z","lastTransitionTime":"2026-02-19T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.108878 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.108940 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.108961 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.108989 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.109024 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:01Z","lastTransitionTime":"2026-02-19T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.212287 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.212353 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.212375 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.212402 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.212421 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:01Z","lastTransitionTime":"2026-02-19T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.314946 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.315014 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.315033 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.315059 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.315076 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:01Z","lastTransitionTime":"2026-02-19T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.417806 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.417845 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.417862 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.417882 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.417893 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:01Z","lastTransitionTime":"2026-02-19T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.520792 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.520859 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.520868 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.520893 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.520935 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:01Z","lastTransitionTime":"2026-02-19T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.624187 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.624277 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.624304 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.624335 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.624357 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:01Z","lastTransitionTime":"2026-02-19T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.700949 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 14:02:03.698099774 +0000 UTC Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.714314 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:01 crc kubenswrapper[4788]: E0219 08:46:01.715151 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.728959 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.729050 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.729068 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.729103 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.729122 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:01Z","lastTransitionTime":"2026-02-19T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.832141 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.832213 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.832236 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.832321 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.832346 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:01Z","lastTransitionTime":"2026-02-19T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.936187 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.936292 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.936315 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.936343 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:01 crc kubenswrapper[4788]: I0219 08:46:01.936360 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:01Z","lastTransitionTime":"2026-02-19T08:46:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.039745 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.039826 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.039846 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.039880 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.039902 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:02Z","lastTransitionTime":"2026-02-19T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.143330 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.143412 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.143432 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.143461 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.143480 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:02Z","lastTransitionTime":"2026-02-19T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.247350 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.247429 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.247452 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.247488 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.247513 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:02Z","lastTransitionTime":"2026-02-19T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.354616 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.354693 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.354714 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.354755 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.354784 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:02Z","lastTransitionTime":"2026-02-19T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.458270 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.458304 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.458314 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.458331 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.458342 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:02Z","lastTransitionTime":"2026-02-19T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.561283 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.561354 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.561371 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.561397 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.561415 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:02Z","lastTransitionTime":"2026-02-19T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.664505 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.664545 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.664556 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.664573 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.664589 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:02Z","lastTransitionTime":"2026-02-19T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.702156 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 10:43:50.855585127 +0000 UTC Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.713630 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.713655 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.713731 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:02 crc kubenswrapper[4788]: E0219 08:46:02.713800 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:02 crc kubenswrapper[4788]: E0219 08:46:02.713925 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:02 crc kubenswrapper[4788]: E0219 08:46:02.714144 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.767588 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.767630 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.767640 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.767659 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.767672 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:02Z","lastTransitionTime":"2026-02-19T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.870835 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.870907 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.870926 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.870953 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.870972 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:02Z","lastTransitionTime":"2026-02-19T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.973030 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.973096 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.973115 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.973143 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:02 crc kubenswrapper[4788]: I0219 08:46:02.973161 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:02Z","lastTransitionTime":"2026-02-19T08:46:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.075943 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.076045 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.076070 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.076100 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.076122 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:03Z","lastTransitionTime":"2026-02-19T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.178355 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.178412 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.178425 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.178445 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.178459 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:03Z","lastTransitionTime":"2026-02-19T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.281918 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.281969 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.281981 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.282003 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.282017 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:03Z","lastTransitionTime":"2026-02-19T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.384737 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.384798 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.384814 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.384838 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.384854 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:03Z","lastTransitionTime":"2026-02-19T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.487729 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.487783 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.487794 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.487813 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.487826 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:03Z","lastTransitionTime":"2026-02-19T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.590464 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.590563 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.590581 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.590606 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.590628 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:03Z","lastTransitionTime":"2026-02-19T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.693705 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.693792 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.693815 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.693849 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.693872 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:03Z","lastTransitionTime":"2026-02-19T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.702885 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 03:51:00.261898807 +0000 UTC Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.714310 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:03 crc kubenswrapper[4788]: E0219 08:46:03.714986 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.715387 4788 scope.go:117] "RemoveContainer" containerID="15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000" Feb 19 08:46:03 crc kubenswrapper[4788]: E0219 08:46:03.715742 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.797283 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.797352 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.797374 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.797405 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.797433 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:03Z","lastTransitionTime":"2026-02-19T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.899938 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.899984 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.899995 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.900012 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:03 crc kubenswrapper[4788]: I0219 08:46:03.900025 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:03Z","lastTransitionTime":"2026-02-19T08:46:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.002856 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.002889 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.002898 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.002911 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.002920 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:04Z","lastTransitionTime":"2026-02-19T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.105428 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.105480 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.105497 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.105523 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.105540 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:04Z","lastTransitionTime":"2026-02-19T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.208330 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.208384 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.208396 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.208414 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.208427 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:04Z","lastTransitionTime":"2026-02-19T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.310839 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.310900 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.310913 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.310935 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.310948 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:04Z","lastTransitionTime":"2026-02-19T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.413878 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.413921 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.413937 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.413962 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.413981 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:04Z","lastTransitionTime":"2026-02-19T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.516945 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.517016 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.517039 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.517069 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.517091 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:04Z","lastTransitionTime":"2026-02-19T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.620100 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.620143 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.620155 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.620173 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.620189 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:04Z","lastTransitionTime":"2026-02-19T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.703550 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 19:46:16.231052197 +0000 UTC Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.713954 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.714044 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.714298 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:04 crc kubenswrapper[4788]: E0219 08:46:04.714290 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:04 crc kubenswrapper[4788]: E0219 08:46:04.714426 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:04 crc kubenswrapper[4788]: E0219 08:46:04.714648 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.722036 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.722106 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.722116 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.722147 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.722170 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:04Z","lastTransitionTime":"2026-02-19T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.825188 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.825288 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.825308 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.825337 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.825359 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:04Z","lastTransitionTime":"2026-02-19T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.928378 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.928418 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.928427 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.928442 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:04 crc kubenswrapper[4788]: I0219 08:46:04.928452 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:04Z","lastTransitionTime":"2026-02-19T08:46:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.031347 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.031386 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.031399 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.031416 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.031427 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:05Z","lastTransitionTime":"2026-02-19T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.133764 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.133798 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.133808 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.133824 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.133834 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:05Z","lastTransitionTime":"2026-02-19T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.236845 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.236933 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.236954 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.237144 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.237169 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:05Z","lastTransitionTime":"2026-02-19T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.340137 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.340213 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.340235 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.340300 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.340325 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:05Z","lastTransitionTime":"2026-02-19T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.443551 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.443625 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.443645 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.443674 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.443695 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:05Z","lastTransitionTime":"2026-02-19T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.546725 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.546772 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.546781 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.546799 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.546815 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:05Z","lastTransitionTime":"2026-02-19T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.648964 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.649000 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.649011 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.649026 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.649036 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:05Z","lastTransitionTime":"2026-02-19T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.704215 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 02:12:40.191856864 +0000 UTC Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.714217 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:05 crc kubenswrapper[4788]: E0219 08:46:05.714383 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.751341 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.751370 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.751379 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.751394 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.751403 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:05Z","lastTransitionTime":"2026-02-19T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.853602 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.853649 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.853658 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.853674 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.853685 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:05Z","lastTransitionTime":"2026-02-19T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.955802 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.955850 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.955860 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.955877 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:05 crc kubenswrapper[4788]: I0219 08:46:05.955887 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:05Z","lastTransitionTime":"2026-02-19T08:46:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.058537 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.058590 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.058600 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.058619 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.058630 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.161175 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.161220 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.161233 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.161270 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.161284 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.168788 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:06 crc kubenswrapper[4788]: E0219 08:46:06.168920 4788 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:46:06 crc kubenswrapper[4788]: E0219 08:46:06.168994 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs podName:ad68454a-3350-49a5-9047-8b78e81ec79c nodeName:}" failed. No retries permitted until 2026-02-19 08:46:38.168974819 +0000 UTC m=+100.156986291 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs") pod "network-metrics-daemon-qbwlq" (UID: "ad68454a-3350-49a5-9047-8b78e81ec79c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.263664 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.263698 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.263706 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.263720 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.263729 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.356147 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.356184 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.356193 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.356209 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.356219 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: E0219 08:46:06.367490 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.370725 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.370758 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.370768 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.370784 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.370797 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: E0219 08:46:06.383160 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.386485 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.386515 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.386524 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.386538 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.386550 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: E0219 08:46:06.400862 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.404163 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.404235 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.404309 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.404354 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.404418 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: E0219 08:46:06.417144 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.421150 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.421195 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.421204 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.421220 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.421231 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: E0219 08:46:06.438150 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:06Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:06 crc kubenswrapper[4788]: E0219 08:46:06.438268 4788 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.439589 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.439652 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.439665 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.439686 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.439719 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.543152 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.543207 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.543226 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.543288 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.543317 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.646556 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.646606 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.646615 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.646631 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.646642 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.704515 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 19:56:50.077659068 +0000 UTC Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.713816 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.713878 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:06 crc kubenswrapper[4788]: E0219 08:46:06.713956 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:06 crc kubenswrapper[4788]: E0219 08:46:06.714031 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.714126 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:06 crc kubenswrapper[4788]: E0219 08:46:06.714363 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.748503 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.748562 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.748577 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.748599 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.748613 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.851835 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.851885 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.851894 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.851912 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.851926 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.954457 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.954497 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.954506 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.954526 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:06 crc kubenswrapper[4788]: I0219 08:46:06.954535 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:06Z","lastTransitionTime":"2026-02-19T08:46:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.056979 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.057054 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.057065 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.057082 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.057093 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:07Z","lastTransitionTime":"2026-02-19T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.160397 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.160459 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.160477 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.160506 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.160525 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:07Z","lastTransitionTime":"2026-02-19T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.263396 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.263454 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.263472 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.263498 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.263520 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:07Z","lastTransitionTime":"2026-02-19T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.366164 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.366230 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.366324 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.366361 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.366384 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:07Z","lastTransitionTime":"2026-02-19T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.469560 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.469631 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.469651 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.469683 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.469702 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:07Z","lastTransitionTime":"2026-02-19T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.572822 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.572879 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.572889 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.572906 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.572917 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:07Z","lastTransitionTime":"2026-02-19T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.675541 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.675584 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.675598 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.675942 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.675960 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:07Z","lastTransitionTime":"2026-02-19T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.705684 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 19:43:56.495587392 +0000 UTC Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.713998 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:07 crc kubenswrapper[4788]: E0219 08:46:07.714168 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.779120 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.779191 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.779207 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.779233 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.779322 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:07Z","lastTransitionTime":"2026-02-19T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.881313 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.881348 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.881357 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.881373 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.881383 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:07Z","lastTransitionTime":"2026-02-19T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.983541 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.983575 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.983584 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.983598 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:07 crc kubenswrapper[4788]: I0219 08:46:07.983609 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:07Z","lastTransitionTime":"2026-02-19T08:46:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.087328 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.087416 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.087438 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.087465 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.087484 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:08Z","lastTransitionTime":"2026-02-19T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.132151 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hxf6_a5c26787-29de-439a-86b8-920cac6c8ab8/kube-multus/0.log" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.132207 4788 generic.go:334] "Generic (PLEG): container finished" podID="a5c26787-29de-439a-86b8-920cac6c8ab8" containerID="217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0" exitCode=1 Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.132283 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hxf6" event={"ID":"a5c26787-29de-439a-86b8-920cac6c8ab8","Type":"ContainerDied","Data":"217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0"} Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.132845 4788 scope.go:117] "RemoveContainer" containerID="217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.157809 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.173446 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.188334 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.192355 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.192421 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.192445 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.192477 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.192504 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:08Z","lastTransitionTime":"2026-02-19T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.201444 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:46:08Z\\\",\\\"message\\\":\\\"2026-02-19T08:45:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695\\\\n2026-02-19T08:45:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695 to /host/opt/cni/bin/\\\\n2026-02-19T08:45:23Z [verbose] multus-daemon started\\\\n2026-02-19T08:45:23Z [verbose] Readiness Indicator file check\\\\n2026-02-19T08:46:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.221657 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.235778 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.253603 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61ae5a8b-ebc9-4f33-a0a5-d0e94443363f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68431cd65fb593de62afcc19c22d4e3f8d8da669e9a1fa22c820abbbf27585a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefb227dd2add39a5e4c83674ebe19e375697257f8bc8c29a15e951e49650be4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e5428237ae5741c34aae6754486baedada4ba34c8d5134ea392c6d54698836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.271641 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a0198
20e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.285764 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc 
kubenswrapper[4788]: I0219 08:46:08.295037 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.295091 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.295106 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.295147 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.295165 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:08Z","lastTransitionTime":"2026-02-19T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.309975 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:47Z\\\",\\\"message\\\":\\\"erved-pods:v4/a13607449821398607916) with []\\\\nI0219 08:45:47.877318 6413 factory.go:1336] Added *v1.Node event handler 7\\\\nI0219 08:45:47.877447 6413 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0219 08:45:47.877468 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:47.877609 6413 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:47.877679 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:47.877714 6413 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:45:47.877723 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:45:47.877746 6413 factory.go:656] Stopping watch factory\\\\nI0219 08:45:47.877771 6413 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:45:47.878078 6413 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0219 08:45:47.878233 6413 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0219 08:45:47.878365 6413 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:47.878438 6413 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 08:45:47.878574 6413 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa9
4232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.328972 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.338572 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.350747 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.372396 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\
\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e
4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.386811 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.398329 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.398396 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.398414 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.398440 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.398459 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:08Z","lastTransitionTime":"2026-02-19T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.399624 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.417953 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.433715 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.500972 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.501048 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.501072 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.501103 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.501125 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:08Z","lastTransitionTime":"2026-02-19T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.603684 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.603734 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.603746 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.603765 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.603778 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:08Z","lastTransitionTime":"2026-02-19T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.705798 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 07:39:57.63098276 +0000 UTC Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.706804 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.706854 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.706874 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.706904 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.706922 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:08Z","lastTransitionTime":"2026-02-19T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.714291 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.714364 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:08 crc kubenswrapper[4788]: E0219 08:46:08.714478 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:08 crc kubenswrapper[4788]: E0219 08:46:08.714705 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.714931 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:08 crc kubenswrapper[4788]: E0219 08:46:08.715111 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.736206 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.752760 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.775158 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.790292 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.807104 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.809030 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.809078 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.809091 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.809113 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.809155 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:08Z","lastTransitionTime":"2026-02-19T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.822634 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:46:08Z\\\",\\\"message\\\":\\\"2026-02-19T08:45:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695\\\\n2026-02-19T08:45:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695 to /host/opt/cni/bin/\\\\n2026-02-19T08:45:23Z [verbose] multus-daemon started\\\\n2026-02-19T08:45:23Z [verbose] Readiness Indicator file check\\\\n2026-02-19T08:46:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.839670 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.852790 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.866291 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61ae5a8b-ebc9-4f33-a0a5-d0e94443363f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68431cd65fb593de62afcc19c22d4e3f8d8da669e9a1fa22c820abbbf27585a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefb227dd2add39a5e4c83674ebe19e375697257f8bc8c29a15e951e49650be4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e5428237ae5741c34aae6754486baedada4ba34c8d5134ea392c6d54698836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.879093 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.891769 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.911568 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:47Z\\\",\\\"message\\\":\\\"erved-pods:v4/a13607449821398607916) with []\\\\nI0219 08:45:47.877318 6413 factory.go:1336] Added *v1.Node event handler 7\\\\nI0219 08:45:47.877447 6413 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0219 08:45:47.877468 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:47.877609 6413 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:47.877679 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:47.877714 6413 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:45:47.877723 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:45:47.877746 6413 factory.go:656] Stopping watch factory\\\\nI0219 08:45:47.877771 6413 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:45:47.878078 6413 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0219 08:45:47.878233 6413 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0219 08:45:47.878365 6413 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:47.878438 6413 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 08:45:47.878574 6413 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa9
4232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.911873 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.911911 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.911927 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.911950 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.911967 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:08Z","lastTransitionTime":"2026-02-19T08:46:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.927066 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a019820e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.938925 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc 
kubenswrapper[4788]: I0219 08:46:08.951666 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.970624 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.986306 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:08 crc kubenswrapper[4788]: I0219 08:46:08.998096 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:08Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.014341 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.014436 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.014464 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.014500 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.014528 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:09Z","lastTransitionTime":"2026-02-19T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.116845 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.116899 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.116914 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.116932 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.116943 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:09Z","lastTransitionTime":"2026-02-19T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.137554 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hxf6_a5c26787-29de-439a-86b8-920cac6c8ab8/kube-multus/0.log" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.137651 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hxf6" event={"ID":"a5c26787-29de-439a-86b8-920cac6c8ab8","Type":"ContainerStarted","Data":"13b1bb93d87b038211f1e816a2498a060120d6338c3cae845dff3c87bb6e924d"} Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.154693 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8
b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91
b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.174181 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.191992 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.205484 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.219627 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.219667 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.219679 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.219703 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.219714 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:09Z","lastTransitionTime":"2026-02-19T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.219871 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61ae5a8b-ebc9-4f33-a0a5-d0e94443363f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68431cd65fb593de62afcc19c22d4e3f8d8da669e9a1fa22c820abbbf27585a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefb227dd2add39a5e4c83674ebe19e375697257f8bc8c29a15e951e49650be4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e5428237ae5741c34aae6754486baedada4ba34c8d5134ea392c6d54698836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.232449 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.249681 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.265107 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.279549 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1bb93d87b038211f1e816a2498a060120d6338c3cae845dff3c87bb6e924d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:46:08Z\\\",\\\"message\\\":\\\"2026-02-19T08:45:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695\\\\n2026-02-19T08:45:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695 to /host/opt/cni/bin/\\\\n2026-02-19T08:45:23Z [verbose] multus-daemon started\\\\n2026-02-19T08:45:23Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T08:46:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:46:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.293084 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8
d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.303927 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.321697 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.321729 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.321737 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.321753 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.321763 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:09Z","lastTransitionTime":"2026-02-19T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.323204 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:47Z\\\",\\\"message\\\":\\\"erved-pods:v4/a13607449821398607916) with []\\\\nI0219 08:45:47.877318 6413 factory.go:1336] Added *v1.Node event handler 7\\\\nI0219 08:45:47.877447 6413 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0219 08:45:47.877468 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:47.877609 6413 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:47.877679 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:47.877714 6413 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:45:47.877723 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:45:47.877746 6413 factory.go:656] Stopping watch factory\\\\nI0219 08:45:47.877771 6413 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:45:47.878078 6413 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0219 08:45:47.878233 6413 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0219 08:45:47.878365 6413 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:47.878438 6413 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 08:45:47.878574 6413 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa9
4232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.334589 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a0198
20e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.344946 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc 
kubenswrapper[4788]: I0219 08:46:09.365029 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.378549 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.387761 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.397298 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:09Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.424163 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.424216 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.424233 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.424280 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.424301 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:09Z","lastTransitionTime":"2026-02-19T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.526444 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.526475 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.526484 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.526495 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.526503 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:09Z","lastTransitionTime":"2026-02-19T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.628897 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.628930 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.628941 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.628954 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.628963 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:09Z","lastTransitionTime":"2026-02-19T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.706809 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 01:47:54.77850512 +0000 UTC Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.714220 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:09 crc kubenswrapper[4788]: E0219 08:46:09.714378 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.730912 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.731002 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.731032 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.731068 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.731087 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:09Z","lastTransitionTime":"2026-02-19T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.834102 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.834170 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.834189 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.834283 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.834303 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:09Z","lastTransitionTime":"2026-02-19T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.937985 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.938075 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.938100 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.938130 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:09 crc kubenswrapper[4788]: I0219 08:46:09.938155 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:09Z","lastTransitionTime":"2026-02-19T08:46:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.040346 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.040389 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.040398 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.040413 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.040423 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:10Z","lastTransitionTime":"2026-02-19T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.142807 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.142877 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.142897 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.142923 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.142941 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:10Z","lastTransitionTime":"2026-02-19T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.246227 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.246285 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.246295 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.246313 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.246323 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:10Z","lastTransitionTime":"2026-02-19T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.349103 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.349150 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.349160 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.349175 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.349188 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:10Z","lastTransitionTime":"2026-02-19T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.452360 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.452414 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.452426 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.452445 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.452458 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:10Z","lastTransitionTime":"2026-02-19T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.554811 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.554855 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.554868 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.554885 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.554896 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:10Z","lastTransitionTime":"2026-02-19T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.657609 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.657649 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.657659 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.657676 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.657687 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:10Z","lastTransitionTime":"2026-02-19T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.707193 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 03:42:08.819643289 +0000 UTC Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.713559 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:10 crc kubenswrapper[4788]: E0219 08:46:10.713704 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.713743 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.713816 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:10 crc kubenswrapper[4788]: E0219 08:46:10.713878 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:10 crc kubenswrapper[4788]: E0219 08:46:10.714042 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.760975 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.761149 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.761214 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.761346 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.761447 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:10Z","lastTransitionTime":"2026-02-19T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.864830 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.864888 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.864906 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.864933 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.864952 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:10Z","lastTransitionTime":"2026-02-19T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.967493 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.967543 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.967553 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.967571 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:10 crc kubenswrapper[4788]: I0219 08:46:10.967583 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:10Z","lastTransitionTime":"2026-02-19T08:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.070126 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.070185 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.070201 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.070227 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.070273 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:11Z","lastTransitionTime":"2026-02-19T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.172971 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.173068 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.173094 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.173133 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.173152 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:11Z","lastTransitionTime":"2026-02-19T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.276026 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.276074 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.276083 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.276099 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.276110 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:11Z","lastTransitionTime":"2026-02-19T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.378901 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.378940 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.378954 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.378970 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.378985 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:11Z","lastTransitionTime":"2026-02-19T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.482648 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.482712 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.482729 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.482756 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.482774 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:11Z","lastTransitionTime":"2026-02-19T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.586112 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.586159 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.586172 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.586190 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.586204 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:11Z","lastTransitionTime":"2026-02-19T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.689651 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.689699 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.689709 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.689727 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.689737 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:11Z","lastTransitionTime":"2026-02-19T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.707958 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 14:40:11.258392672 +0000 UTC Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.713352 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:11 crc kubenswrapper[4788]: E0219 08:46:11.713571 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.792549 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.792641 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.792669 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.792704 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.792728 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:11Z","lastTransitionTime":"2026-02-19T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.895343 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.895407 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.895423 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.895447 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.895463 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:11Z","lastTransitionTime":"2026-02-19T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.998353 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.998421 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.998435 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.998458 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:11 crc kubenswrapper[4788]: I0219 08:46:11.998472 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:11Z","lastTransitionTime":"2026-02-19T08:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.101238 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.101314 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.101326 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.101346 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.101360 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:12Z","lastTransitionTime":"2026-02-19T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.204930 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.204993 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.205010 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.205037 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.205056 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:12Z","lastTransitionTime":"2026-02-19T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.307414 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.307476 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.307495 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.307518 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.307538 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:12Z","lastTransitionTime":"2026-02-19T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.410738 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.410813 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.410830 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.410860 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.410879 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:12Z","lastTransitionTime":"2026-02-19T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.514216 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.514334 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.514354 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.514389 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.514427 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:12Z","lastTransitionTime":"2026-02-19T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.616774 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.616846 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.616862 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.616883 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.616896 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:12Z","lastTransitionTime":"2026-02-19T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.709041 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 01:47:41.482166851 +0000 UTC Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.713608 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.713674 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.713828 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:12 crc kubenswrapper[4788]: E0219 08:46:12.714060 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:12 crc kubenswrapper[4788]: E0219 08:46:12.714233 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:12 crc kubenswrapper[4788]: E0219 08:46:12.714403 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.719881 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.719954 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.719977 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.720010 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.720034 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:12Z","lastTransitionTime":"2026-02-19T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.823318 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.823382 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.823407 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.823440 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.823467 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:12Z","lastTransitionTime":"2026-02-19T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.925811 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.925879 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.925896 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.925921 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:12 crc kubenswrapper[4788]: I0219 08:46:12.925944 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:12Z","lastTransitionTime":"2026-02-19T08:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.029116 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.029183 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.029199 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.029226 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.029269 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:13Z","lastTransitionTime":"2026-02-19T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.132113 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.132171 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.132183 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.132201 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.132213 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:13Z","lastTransitionTime":"2026-02-19T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.234509 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.234560 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.234571 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.234594 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.234621 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:13Z","lastTransitionTime":"2026-02-19T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.338927 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.339007 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.339026 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.339052 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.339071 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:13Z","lastTransitionTime":"2026-02-19T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.442347 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.442413 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.442431 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.442457 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.442475 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:13Z","lastTransitionTime":"2026-02-19T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.545118 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.545155 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.545164 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.545181 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.545194 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:13Z","lastTransitionTime":"2026-02-19T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.648234 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.648351 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.648369 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.648399 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.648424 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:13Z","lastTransitionTime":"2026-02-19T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.709183 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 18:39:38.286391789 +0000 UTC Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.713639 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:13 crc kubenswrapper[4788]: E0219 08:46:13.713876 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.752267 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.752301 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.752314 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.752328 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.752341 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:13Z","lastTransitionTime":"2026-02-19T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.854419 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.854488 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.854511 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.854540 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.854562 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:13Z","lastTransitionTime":"2026-02-19T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.957787 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.957834 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.957847 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.957863 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:13 crc kubenswrapper[4788]: I0219 08:46:13.957872 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:13Z","lastTransitionTime":"2026-02-19T08:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.061960 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.062026 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.062044 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.062070 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.062088 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:14Z","lastTransitionTime":"2026-02-19T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.164718 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.164779 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.164797 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.164821 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.164837 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:14Z","lastTransitionTime":"2026-02-19T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.268219 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.268328 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.268348 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.268376 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.268398 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:14Z","lastTransitionTime":"2026-02-19T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.371450 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.371528 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.371551 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.371618 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.371643 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:14Z","lastTransitionTime":"2026-02-19T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.474540 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.474591 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.474600 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.474618 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.474630 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:14Z","lastTransitionTime":"2026-02-19T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.577844 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.577900 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.577917 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.577938 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.577956 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:14Z","lastTransitionTime":"2026-02-19T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.680817 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.680860 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.680870 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.680885 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.680896 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:14Z","lastTransitionTime":"2026-02-19T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.709613 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 14:14:33.055448174 +0000 UTC Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.713985 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.714114 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:14 crc kubenswrapper[4788]: E0219 08:46:14.714203 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.714322 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:14 crc kubenswrapper[4788]: E0219 08:46:14.714481 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:14 crc kubenswrapper[4788]: E0219 08:46:14.714709 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.784514 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.784582 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.784604 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.784640 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.784666 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:14Z","lastTransitionTime":"2026-02-19T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.889557 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.889621 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.889639 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.889668 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.889686 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:14Z","lastTransitionTime":"2026-02-19T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.992409 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.992453 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.992467 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.992493 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:14 crc kubenswrapper[4788]: I0219 08:46:14.992506 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:14Z","lastTransitionTime":"2026-02-19T08:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.095483 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.095521 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.095529 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.095554 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.095564 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:15Z","lastTransitionTime":"2026-02-19T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.198604 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.198665 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.198682 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.198708 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.198725 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:15Z","lastTransitionTime":"2026-02-19T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.302094 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.302152 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.302168 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.302195 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.302218 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:15Z","lastTransitionTime":"2026-02-19T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.405305 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.405372 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.405390 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.405414 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.405433 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:15Z","lastTransitionTime":"2026-02-19T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.508632 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.508703 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.508731 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.508757 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.508774 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:15Z","lastTransitionTime":"2026-02-19T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.612375 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.612472 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.612498 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.612534 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.612562 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:15Z","lastTransitionTime":"2026-02-19T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.709729 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 17:57:23.048937271 +0000 UTC Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.713396 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:15 crc kubenswrapper[4788]: E0219 08:46:15.713655 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.715871 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.715929 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.715954 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.715984 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.716010 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:15Z","lastTransitionTime":"2026-02-19T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.819058 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.819171 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.819196 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.819231 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.819296 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:15Z","lastTransitionTime":"2026-02-19T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.922827 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.922913 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.922934 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.922966 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:15 crc kubenswrapper[4788]: I0219 08:46:15.922988 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:15Z","lastTransitionTime":"2026-02-19T08:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.027196 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.027301 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.027320 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.027346 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.027364 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.130963 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.131025 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.131045 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.131073 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.131093 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.234767 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.234888 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.234924 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.234958 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.234982 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.338719 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.339064 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.339210 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.339383 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.339525 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.443288 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.443387 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.443413 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.443454 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.443479 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.449038 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.449107 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.449131 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.449161 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.449181 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: E0219 08:46:16.466718 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:16Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.471505 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.471645 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.471744 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.471840 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.471976 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: E0219 08:46:16.489791 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:16Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.494929 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.495113 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.495251 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.495443 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.495584 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: E0219 08:46:16.515640 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:16Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.520848 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.520915 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.520939 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.520971 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.520993 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: E0219 08:46:16.541797 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:16Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.547156 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.547222 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.547242 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.547309 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.547328 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: E0219 08:46:16.571451 4788 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9f17f6ab-a8e6-46f8-93e5-a456adb8cae3\\\",\\\"systemUUID\\\":\\\"24e72cbc-0955-41b3-bfe8-a41d7b46c663\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:16Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:16 crc kubenswrapper[4788]: E0219 08:46:16.571785 4788 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.574738 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.574808 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.574833 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.574867 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.574889 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.677476 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.677532 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.677549 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.677574 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.677593 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.710496 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 18:20:48.435628685 +0000 UTC Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.714079 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.714091 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:16 crc kubenswrapper[4788]: E0219 08:46:16.714308 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.714365 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:16 crc kubenswrapper[4788]: E0219 08:46:16.714514 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:16 crc kubenswrapper[4788]: E0219 08:46:16.714741 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.780746 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.780827 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.780853 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.780888 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.780908 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.883758 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.883810 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.883827 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.883850 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.883867 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.986550 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.986595 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.986612 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.986635 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:16 crc kubenswrapper[4788]: I0219 08:46:16.986653 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:16Z","lastTransitionTime":"2026-02-19T08:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.090058 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.090104 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.090119 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.090141 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.090158 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:17Z","lastTransitionTime":"2026-02-19T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.193648 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.193717 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.193739 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.193770 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.193793 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:17Z","lastTransitionTime":"2026-02-19T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.296634 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.296698 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.296718 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.296759 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.296780 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:17Z","lastTransitionTime":"2026-02-19T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.401342 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.401429 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.401450 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.401484 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.401511 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:17Z","lastTransitionTime":"2026-02-19T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.504435 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.504505 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.504522 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.504554 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.504577 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:17Z","lastTransitionTime":"2026-02-19T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.607421 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.607493 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.607513 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.607555 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.607595 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:17Z","lastTransitionTime":"2026-02-19T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.710676 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 11:37:39.267894329 +0000 UTC Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.711796 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.711866 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.711885 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.711913 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.711931 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:17Z","lastTransitionTime":"2026-02-19T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.714019 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:17 crc kubenswrapper[4788]: E0219 08:46:17.714202 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.715374 4788 scope.go:117] "RemoveContainer" containerID="15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.814662 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.814894 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.815055 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.815193 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.815352 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:17Z","lastTransitionTime":"2026-02-19T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.919237 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.919613 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.920100 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.920208 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:17 crc kubenswrapper[4788]: I0219 08:46:17.920404 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:17Z","lastTransitionTime":"2026-02-19T08:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.023885 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.023938 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.023956 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.023981 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.024000 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:18Z","lastTransitionTime":"2026-02-19T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.126327 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.126386 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.126403 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.126428 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.126449 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:18Z","lastTransitionTime":"2026-02-19T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.172460 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/2.log" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.177143 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.178155 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.213300 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:47Z\\\",\\\"message\\\":\\\"erved-pods:v4/a13607449821398607916) with []\\\\nI0219 08:45:47.877318 6413 factory.go:1336] Added *v1.Node event handler 7\\\\nI0219 08:45:47.877447 6413 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0219 08:45:47.877468 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:47.877609 
6413 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:47.877679 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:47.877714 6413 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:45:47.877723 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:45:47.877746 6413 factory.go:656] Stopping watch factory\\\\nI0219 08:45:47.877771 6413 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:45:47.878078 6413 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0219 08:45:47.878233 6413 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0219 08:45:47.878365 6413 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:47.878438 6413 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 08:45:47.878574 6413 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.229508 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.229556 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.229568 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.229590 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.229606 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:18Z","lastTransitionTime":"2026-02-19T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.237693 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a019820e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.263720 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc 
kubenswrapper[4788]: I0219 08:46:18.294520 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.316922 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.332371 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.332745 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.332779 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.332820 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.332840 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.332853 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:18Z","lastTransitionTime":"2026-02-19T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.344474 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c20767f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.356682 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.371762 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.385742 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.398891 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.413705 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1bb93d87b038211f1e816a2498a060120d6338c3cae845dff3c87bb6e924d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:46:08Z\\\",\\\"message\\\":\\\"2026-02-19T08:45:22+00:00 
[cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695\\\\n2026-02-19T08:45:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695 to /host/opt/cni/bin/\\\\n2026-02-19T08:45:23Z [verbose] multus-daemon started\\\\n2026-02-19T08:45:23Z [verbose] Readiness Indicator file check\\\\n2026-02-19T08:46:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:46:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"nam
e\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.432806 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911
452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.435336 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.435399 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.435412 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.435429 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.435442 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:18Z","lastTransitionTime":"2026-02-19T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.446568 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.463644 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61ae5a8b-ebc9-4f33-a0a5-d0e94443363f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68431cd65fb593de62afcc19c22d4e3f8d8da669e9a1fa22c820abbbf27585a0\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefb227dd2add39a5e4c83674ebe19e375697257f8bc8c29a15e951e49650be4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e5428237ae5741c34aae6754486baedada4ba34c8d5134ea392c6d54698836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.487357 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453
265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.502615 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.516235 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.537658 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.537695 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.537707 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.537728 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.537740 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:18Z","lastTransitionTime":"2026-02-19T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.641646 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.641709 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.641728 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.641758 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.641776 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:18Z","lastTransitionTime":"2026-02-19T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.711491 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 18:39:17.37291927 +0000 UTC Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.713989 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:18 crc kubenswrapper[4788]: E0219 08:46:18.714206 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.714377 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:18 crc kubenswrapper[4788]: E0219 08:46:18.714600 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.714767 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:18 crc kubenswrapper[4788]: E0219 08:46:18.714908 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.735941 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"61ae5a8b-ebc9-4f33-a0a5-d0e94443363f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68431cd65fb593de62afcc19c22d4e3f8d8da669e9a1fa22c820abbbf27585a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefb227dd2add39a5e4c83674ebe19e375697257f8bc8c29a15e951e49650be4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e5428237ae5741c34aae6754486baedada4ba34c8d5134ea392c6d54698836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.745148 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.745209 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.745233 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.745307 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.745329 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:18Z","lastTransitionTime":"2026-02-19T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.761625 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.782137 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.804416 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.827724 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1bb93d87b038211f1e816a2498a060120d6338c3cae845dff3c87bb6e924d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:46:08Z\\\",\\\"message\\\":\\\"2026-02-19T08:45:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695\\\\n2026-02-19T08:45:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695 to /host/opt/cni/bin/\\\\n2026-02-19T08:45:23Z [verbose] multus-daemon started\\\\n2026-02-19T08:45:23Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T08:46:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:46:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.847527 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.847581 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.847600 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.847625 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.847643 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:18Z","lastTransitionTime":"2026-02-19T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.862509 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.891171 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.932715 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:47Z\\\",\\\"message\\\":\\\"erved-pods:v4/a13607449821398607916) with []\\\\nI0219 08:45:47.877318 6413 factory.go:1336] Added *v1.Node event handler 7\\\\nI0219 08:45:47.877447 6413 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0219 08:45:47.877468 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:47.877609 6413 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:47.877679 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:47.877714 6413 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:45:47.877723 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:45:47.877746 6413 factory.go:656] Stopping watch factory\\\\nI0219 08:45:47.877771 6413 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:45:47.878078 6413 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0219 08:45:47.878233 6413 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0219 08:45:47.878365 6413 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:47.878438 6413 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 08:45:47.878574 6413 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.950225 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.950273 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.950286 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.950306 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.950318 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:18Z","lastTransitionTime":"2026-02-19T08:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.955815 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a019820e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:18 crc kubenswrapper[4788]: I0219 08:46:18.971100 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:18Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:19 crc 
kubenswrapper[4788]: I0219 08:46:19.008075 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.021192 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.035437 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.050045 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.053336 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.053364 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.053372 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.053387 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.053397 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:19Z","lastTransitionTime":"2026-02-19T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.064669 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\",\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.081023 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.102646 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.120540 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:19Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.156604 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.156687 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.156710 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.156747 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.156770 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:19Z","lastTransitionTime":"2026-02-19T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.259702 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.259780 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.259805 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.259836 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.259859 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:19Z","lastTransitionTime":"2026-02-19T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.365171 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.365228 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.365240 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.365283 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.365300 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:19Z","lastTransitionTime":"2026-02-19T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.467690 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.467735 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.467747 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.467766 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.467778 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:19Z","lastTransitionTime":"2026-02-19T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.570815 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.570887 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.570910 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.570940 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.570965 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:19Z","lastTransitionTime":"2026-02-19T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.673599 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.673671 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.673690 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.673717 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.673735 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:19Z","lastTransitionTime":"2026-02-19T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.712104 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 03:38:20.964411316 +0000 UTC Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.713421 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:19 crc kubenswrapper[4788]: E0219 08:46:19.713616 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.776753 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.776829 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.776847 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.776874 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.776893 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:19Z","lastTransitionTime":"2026-02-19T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.880463 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.880540 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.880565 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.880594 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.880618 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:19Z","lastTransitionTime":"2026-02-19T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.983283 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.983368 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.983394 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.983426 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:19 crc kubenswrapper[4788]: I0219 08:46:19.983495 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:19Z","lastTransitionTime":"2026-02-19T08:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.087005 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.087106 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.087128 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.087157 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.087177 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:20Z","lastTransitionTime":"2026-02-19T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.186203 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/3.log" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.187193 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/2.log" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.189811 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.189868 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.189884 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.189914 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.189932 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:20Z","lastTransitionTime":"2026-02-19T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.191808 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e" exitCode=1 Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.191875 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"} Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.191939 4788 scope.go:117] "RemoveContainer" containerID="15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.193034 4788 scope.go:117] "RemoveContainer" containerID="0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e" Feb 19 08:46:20 crc kubenswrapper[4788]: E0219 08:46:20.193336 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.219924 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5eb9b73e-bd92-4b79-97c3-d9b9955f8375\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-19T08:45:12Z\\\"
,\\\"message\\\":\\\"W0219 08:45:01.880057 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0219 08:45:01.880478 1 crypto.go:601] Generating new CA for check-endpoints-signer@1771490701 cert, and key in /tmp/serving-cert-2058713600/serving-signer.crt, /tmp/serving-cert-2058713600/serving-signer.key\\\\nI0219 08:45:02.093824 1 observer_polling.go:159] Starting file observer\\\\nW0219 08:45:02.098721 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0219 08:45:02.098946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0219 08:45:02.101283 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2058713600/tls.crt::/tmp/serving-cert-2058713600/tls.key\\\\\\\"\\\\nF0219 08:45:12.576835 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.241164 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3545a7a-dc11-4c76-ac33-a776f41e0612\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2bb774bcff726e3b92a9aec7168941586e233feb835379a341eebcd8ad9795f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9
ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://728328649a88a94b67a5e0a873f7ac7fc0e29af75a18a83c551ad066179c6753\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://926c5092d0d41ce02817715b370f4934cf8154bd98ab40963f37ef6c61a96ee0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.258071 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f0668cf4b0a146aa928f3312a311448f63f90d260e790af25504c28143a305e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.277377 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.293295 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.293373 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.293399 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.293428 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.293449 4788 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:20Z","lastTransitionTime":"2026-02-19T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.294409 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6lplm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93e79622-b196-49af-8474-1d25444de3ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2c17e784b7c585de025e710e3fe7b03ca2e61928725771b9ba47c14f3b99917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6z85l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6lplm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.311348 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"61ae5a8b-ebc9-4f33-a0a5-d0e94443363f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68431cd65fb593de62afcc19c22d4e3f8d8da669e9a1fa22c820abbbf27585a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eefb227dd2add39a5e4c83674ebe19e375697257f8bc8c29a15e951e49650be4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55e5428237ae5741c34aae6754486baedada4ba34c8d5134ea392c6d54698836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://81b5724006beabf7397d935275d6c71039fea9073db4999df56f2e05aaa40dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.330551 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad84ab31c20b3a0c7e10a5b3588855c0da3f30eb62c14f626a31f4af558f1fe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2713eb95e0d8843526f67d671da9c2b5c6f2847eb83bd4a8c1dbc9650109a447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.348959 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.367136 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.386578 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9hxf6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5c26787-29de-439a-86b8-920cac6c8ab8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:46:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13b1bb93d87b038211f1e816a2498a060120d6338c3cae845dff3c87bb6e924d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:46:08Z\\\",\\\"message\\\":\\\"2026-02-19T08:45:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695\\\\n2026-02-19T08:45:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_240928f7-9960-41be-988b-98b64eff6695 to /host/opt/cni/bin/\\\\n2026-02-19T08:45:23Z [verbose] multus-daemon started\\\\n2026-02-19T08:45:23Z [verbose] 
Readiness Indicator file check\\\\n2026-02-19T08:46:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:46:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9hxf6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.396093 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.396138 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.396155 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.396181 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.396198 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:20Z","lastTransitionTime":"2026-02-19T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.409375 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a76d0d1-0c0d-47fa-952a-fe34687e34ca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f8d27df679e40cd3db81886f09c6f936b44cc62c3a9bd571d1c693351df5cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e644c4ba42af78a0edec45e74eaa7f9abe6a7222bf2a5f3ce1e9149ca2d01d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://58d8a766ece39ed21e84f12d0aab64c789753b44c34c40f0b2d73c5452f68250\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://858d895887a4bf6faa601c7bfb7bc0d9d9d4bdf56b105963e858bea34b500868\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8911452b31ddcc6b9256503ede3c283d06a089be961c79e9bb95457079320dc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0b1883133708a6de07c34a7ed330b5d9e1bfad07cfdafb7ff07b878004ad3f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://914976f74798c92385a11186d07d769cebbb3aed9b65330d49dc003f11c3ed85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jj598\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-7s4rp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.442565 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd5c1c46-74a4-41f4-ad05-af438781bd6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://15a10b1c548fcc2b0c4f0d034c6bb8c45c36c415ae78a5c8ea1d72ec37232000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:45:47Z\\\",\\\"message\\\":\\\"erved-pods:v4/a13607449821398607916) with []\\\\nI0219 08:45:47.877318 6413 factory.go:1336] Added *v1.Node event handler 7\\\\nI0219 08:45:47.877447 6413 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0219 08:45:47.877468 6413 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0219 08:45:47.877609 6413 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0219 08:45:47.877679 6413 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:45:47.877694 6413 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:45:47.877714 6413 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:45:47.877723 6413 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:45:47.877746 6413 factory.go:656] Stopping watch factory\\\\nI0219 08:45:47.877771 6413 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:45:47.878078 6413 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0219 08:45:47.878233 6413 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0219 08:45:47.878365 6413 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:45:47.878438 6413 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0219 08:45:47.878574 6413 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-19T08:46:19Z\\\",\\\"message\\\":\\\"ute/v1/apis/informers/externalversions/factory.go:140\\\\nI0219 08:46:18.784413 6811 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0219 08:46:18.784494 6811 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0219 08:46:18.784522 6811 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0219 08:46:18.784533 6811 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0219 08:46:18.784556 6811 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0219 08:46:18.784567 6811 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0219 08:46:18.784649 6811 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0219 08:46:18.784707 6811 factory.go:656] Stopping watch factory\\\\nI0219 08:46:18.784738 6811 ovnkube.go:599] Stopped ovnkube\\\\nI0219 08:46:18.784788 6811 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0219 08:46:18.784814 6811 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0219 08:46:18.784830 6811 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0219 08:46:18.784847 6811 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0219 08:46:18.784864 6811 handler.go:208] Removed *v1.Node event handler 2\\\\nI0219 08:46:18.784880 6811 handler.go:208] Removed *v1.Node event handler 7\\\\nI0219 08:46:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-19T08:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\
\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f6gjm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xmshh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.461022 4788 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"798f2a0d-d6be-46a9-83e5-67a10abcce47\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1589aeda802ddf4aa12c019aa980e337896c18f1edd1eeedb23603cc3d884a23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d8f2c4b72a085132face0662767c9e1a019820e5eee1be0024537ac8b673203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wfj55\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:32Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-88dw2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.476912 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad68454a-3350-49a5-9047-8b78e81ec79c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2twqj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qbwlq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc 
kubenswrapper[4788]: I0219 08:46:20.499042 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.499102 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.499123 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.499150 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.499172 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:20Z","lastTransitionTime":"2026-02-19T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.509158 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0b33fc5-4899-4097-95aa-816630c4707c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:44:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e1344f194ddaf7c75e6255e3fd74d502c80d64bf7bac92627a5b2e8d86f1a4e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8232cdfa5fee36ed2d35e542116dd2399af7372ffeb8425211614ac52f2cfba7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fba77e69abe0740299f69e2f75b439273905784e9846938d4cc55dc1442858e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fbac5bb24f73cb6d18b969ee2be4a35fef632b461508cbb92efb0f55457a8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae84c5229e9bdd10380885ed61746cff6ef9e4236fb3d13b91bcab0977a95f00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f38aca5e4543b1918f07983a49f07938beac82d64b4d5e0ce41abdfb4543742\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:44:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:44:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63e5d02052996b6003678a349dab4bac570d1297b0a2b4272826fbfc88c29248\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-19T08:45:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529bd2353c8e94c9830aa5f2305384a4000723a314a50dc9404c1d52f6cb5526\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-19T08:45:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-19T08:45:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:44:58Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.534127 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa3ae49eae6ba8ccc8ea230673729236502f96b27faa1ba287847ad255995090\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.549030 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-rfl2j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3d9c112-0fa4-4ad8-84f9-0eb7dd4d92f6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3eee3f507e67219e125f2ee8879f1634e04ecfaf4fec4b01af5de7ae026e747b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkp2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-19T08:45:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-rfl2j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.566333 4788 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c07881f-4511-4cd1-9283-6891826b57a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-19T08:45:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0318c2076
7f068f2d2b21c836925e8a5762fe90a69326be8dbceca676c3814b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-19T08:45:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zk8cn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-02-19T08:45:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-tftzx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-19T08:46:20Z is after 2025-08-24T17:21:41Z" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.602170 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.602228 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.602278 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.602311 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.602334 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:20Z","lastTransitionTime":"2026-02-19T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.705244 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.705338 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.705356 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.705383 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.705401 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:20Z","lastTransitionTime":"2026-02-19T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.712705 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 23:05:48.527975864 +0000 UTC Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.713982 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.714050 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.714163 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:20 crc kubenswrapper[4788]: E0219 08:46:20.714164 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:20 crc kubenswrapper[4788]: E0219 08:46:20.714356 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:20 crc kubenswrapper[4788]: E0219 08:46:20.714481 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.808586 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.808653 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.808669 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.808693 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.808711 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:20Z","lastTransitionTime":"2026-02-19T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.912415 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.912485 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.912503 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.912532 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:20 crc kubenswrapper[4788]: I0219 08:46:20.912551 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:20Z","lastTransitionTime":"2026-02-19T08:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.016069 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.016139 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.016168 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.016195 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.016212 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:21Z","lastTransitionTime":"2026-02-19T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.120111 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.120172 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.120193 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.120227 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.120290 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:21Z","lastTransitionTime":"2026-02-19T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.199608 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/3.log" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.223139 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.223202 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.223223 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.223275 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.223294 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:21Z","lastTransitionTime":"2026-02-19T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.327037 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.327083 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.327097 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.327116 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.327127 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:21Z","lastTransitionTime":"2026-02-19T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.431048 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.431137 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.431169 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.431203 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.431230 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:21Z","lastTransitionTime":"2026-02-19T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.533420 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.533655 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.533727 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.533798 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.533858 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:21Z","lastTransitionTime":"2026-02-19T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.636971 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.637026 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.637045 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.637072 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.637093 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:21Z","lastTransitionTime":"2026-02-19T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.713882 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 10:24:41.626327291 +0000 UTC Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.714568 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:21 crc kubenswrapper[4788]: E0219 08:46:21.714784 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.740727 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.741320 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.741458 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.741693 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.741827 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:21Z","lastTransitionTime":"2026-02-19T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.844990 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.845053 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.845070 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.845096 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.845112 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:21Z","lastTransitionTime":"2026-02-19T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.948349 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.948397 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.948408 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.948429 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:21 crc kubenswrapper[4788]: I0219 08:46:21.948444 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:21Z","lastTransitionTime":"2026-02-19T08:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.051937 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.052011 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.052030 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.052054 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.052072 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:22Z","lastTransitionTime":"2026-02-19T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.155579 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.155644 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.155660 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.155686 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.155703 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:22Z","lastTransitionTime":"2026-02-19T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.259659 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.259720 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.259736 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.259763 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.259780 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:22Z","lastTransitionTime":"2026-02-19T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.362167 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.362587 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.362734 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.362901 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.363038 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:22Z","lastTransitionTime":"2026-02-19T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.465844 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.465899 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.465915 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.465942 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.465965 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:22Z","lastTransitionTime":"2026-02-19T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.569094 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.569460 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.569653 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.569903 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.570058 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:22Z","lastTransitionTime":"2026-02-19T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.575777 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.575917 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.575896801 +0000 UTC m=+148.563908273 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.576086 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.576137 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.576157 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.576184 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576288 4788 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576322 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.576315682 +0000 UTC m=+148.564327144 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576416 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576467 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576484 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576490 4788 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576497 4788 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576512 4788 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576534 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.576527978 +0000 UTC m=+148.564539440 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576574 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.576546439 +0000 UTC m=+148.564558011 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576427 4788 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.576651 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.576633491 +0000 UTC m=+148.564645133 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.673325 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.673405 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.673460 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.673497 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.673525 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:22Z","lastTransitionTime":"2026-02-19T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.713893 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.713987 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.714119 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.714092 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 11:07:23.603429212 +0000 UTC
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.714241 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.714467 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 08:46:22 crc kubenswrapper[4788]: E0219 08:46:22.714653 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.777462 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.777498 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.777525 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.777540 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.777552 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:22Z","lastTransitionTime":"2026-02-19T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.880581 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.880651 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.880673 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.880702 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.880722 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:22Z","lastTransitionTime":"2026-02-19T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.983786 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.983866 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.983883 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.983907 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:22 crc kubenswrapper[4788]: I0219 08:46:22.983925 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:22Z","lastTransitionTime":"2026-02-19T08:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.087111 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.087189 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.087211 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.087238 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.087287 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:23Z","lastTransitionTime":"2026-02-19T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.190500 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.190568 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.190587 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.190613 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.190633 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:23Z","lastTransitionTime":"2026-02-19T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.293675 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.293738 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.293758 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.293784 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.293803 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:23Z","lastTransitionTime":"2026-02-19T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.396225 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.396319 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.396339 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.396366 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.396388 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:23Z","lastTransitionTime":"2026-02-19T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.499085 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.499151 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.499175 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.499207 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.499229 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:23Z","lastTransitionTime":"2026-02-19T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.602302 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.602438 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.602471 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.602550 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.602576 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:23Z","lastTransitionTime":"2026-02-19T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.705630 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.705687 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.705699 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.705717 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.705731 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:23Z","lastTransitionTime":"2026-02-19T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.714028 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq"
Feb 19 08:46:23 crc kubenswrapper[4788]: E0219 08:46:23.714160 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.714296 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 18:37:57.382105158 +0000 UTC
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.810150 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.810219 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.810238 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.810317 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.810351 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:23Z","lastTransitionTime":"2026-02-19T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.913665 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.913726 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.913746 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.913772 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:23 crc kubenswrapper[4788]: I0219 08:46:23.913789 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:23Z","lastTransitionTime":"2026-02-19T08:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.016823 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.016872 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.016889 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.016912 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.016929 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:24Z","lastTransitionTime":"2026-02-19T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.119382 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.119462 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.119486 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.119520 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.119541 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:24Z","lastTransitionTime":"2026-02-19T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.222353 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.222415 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.222433 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.222463 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.222482 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:24Z","lastTransitionTime":"2026-02-19T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.326442 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.326524 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.326548 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.326580 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.326604 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:24Z","lastTransitionTime":"2026-02-19T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.429403 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.429462 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.429483 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.429509 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.429528 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:24Z","lastTransitionTime":"2026-02-19T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.538274 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.538332 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.538344 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.538363 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.538376 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:24Z","lastTransitionTime":"2026-02-19T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.641506 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.641564 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.641581 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.641601 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.641615 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:24Z","lastTransitionTime":"2026-02-19T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.713935 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:46:24 crc kubenswrapper[4788]: E0219 08:46:24.714365 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.714431 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 00:24:26.973452471 +0000 UTC
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.714453 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.714486 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:46:24 crc kubenswrapper[4788]: E0219 08:46:24.714858 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 08:46:24 crc kubenswrapper[4788]: E0219 08:46:24.714968 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.745053 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.745370 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.745535 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.745733 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.745900 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:24Z","lastTransitionTime":"2026-02-19T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.850642 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.850729 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.850752 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.850783 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.850815 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:24Z","lastTransitionTime":"2026-02-19T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.954716 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.955179 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.955467 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.955667 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:24 crc kubenswrapper[4788]: I0219 08:46:24.955839 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:24Z","lastTransitionTime":"2026-02-19T08:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.059340 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.059407 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.059425 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.059453 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.059472 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:25Z","lastTransitionTime":"2026-02-19T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.162648 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.162890 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.162982 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.163080 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.163179 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:25Z","lastTransitionTime":"2026-02-19T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.266004 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.266072 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.266089 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.266115 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.266133 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:25Z","lastTransitionTime":"2026-02-19T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.369235 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.369314 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.369328 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.369350 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.369363 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:25Z","lastTransitionTime":"2026-02-19T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.472844 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.472919 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.472937 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.472965 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.472986 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:25Z","lastTransitionTime":"2026-02-19T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.576423 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.576499 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.576518 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.576543 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.576562 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:25Z","lastTransitionTime":"2026-02-19T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.679083 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.679175 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.679204 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.679240 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.679304 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:25Z","lastTransitionTime":"2026-02-19T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.713553 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:25 crc kubenswrapper[4788]: E0219 08:46:25.713858 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.714719 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 05:33:33.051803989 +0000 UTC Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.782937 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.782989 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.783005 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.783027 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.783045 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:25Z","lastTransitionTime":"2026-02-19T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.885791 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.886364 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.886384 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.886412 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.886433 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:25Z","lastTransitionTime":"2026-02-19T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.989113 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.989154 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.989166 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.989185 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:25 crc kubenswrapper[4788]: I0219 08:46:25.989200 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:25Z","lastTransitionTime":"2026-02-19T08:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.094573 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.094643 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.094666 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.094690 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.094762 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:26Z","lastTransitionTime":"2026-02-19T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.197487 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.197555 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.197572 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.197599 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.197618 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:26Z","lastTransitionTime":"2026-02-19T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.300961 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.301032 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.301056 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.301092 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.301119 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:26Z","lastTransitionTime":"2026-02-19T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.404050 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.404128 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.404159 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.404190 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.404212 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:26Z","lastTransitionTime":"2026-02-19T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.507133 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.507197 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.507220 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.507292 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.507317 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:26Z","lastTransitionTime":"2026-02-19T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.610603 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.610672 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.610693 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.610720 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.610738 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:26Z","lastTransitionTime":"2026-02-19T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.713549 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.713573 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.713732 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:26 crc kubenswrapper[4788]: E0219 08:46:26.713780 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.713802 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.713851 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.713880 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.713902 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:26Z","lastTransitionTime":"2026-02-19T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.713580 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:26 crc kubenswrapper[4788]: E0219 08:46:26.714016 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:26 crc kubenswrapper[4788]: E0219 08:46:26.714208 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.714923 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 15:40:00.500127608 +0000 UTC Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.816578 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.816681 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.816701 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.816723 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 
08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.816741 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:26Z","lastTransitionTime":"2026-02-19T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.889510 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.889559 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.889577 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.889598 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.889615 4788 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-19T08:46:26Z","lastTransitionTime":"2026-02-19T08:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.970315 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb"] Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.970972 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.974111 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.974205 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.974416 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 08:46:26 crc kubenswrapper[4788]: I0219 08:46:26.974117 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.019406 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=69.019339758 podStartE2EDuration="1m9.019339758s" podCreationTimestamp="2026-02-19 08:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:46:27.018703051 +0000 UTC m=+89.006714583" watchObservedRunningTime="2026-02-19 08:46:27.019339758 +0000 UTC m=+89.007351270" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.030365 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/471aa962-9834-49ac-9bdd-2fa7de494d7f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.030453 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/471aa962-9834-49ac-9bdd-2fa7de494d7f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.030721 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/471aa962-9834-49ac-9bdd-2fa7de494d7f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.030825 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/471aa962-9834-49ac-9bdd-2fa7de494d7f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.031216 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/471aa962-9834-49ac-9bdd-2fa7de494d7f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.067012 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rfl2j" podStartSLOduration=69.066971117 podStartE2EDuration="1m9.066971117s" podCreationTimestamp="2026-02-19 08:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:46:27.06634814 +0000 UTC m=+89.054359662" watchObservedRunningTime="2026-02-19 08:46:27.066971117 +0000 UTC m=+89.054982639" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.085424 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podStartSLOduration=67.085385583 podStartE2EDuration="1m7.085385583s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:46:27.084419066 +0000 UTC m=+89.072430558" watchObservedRunningTime="2026-02-19 08:46:27.085385583 +0000 UTC m=+89.073397095" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.123183 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=69.12314811 podStartE2EDuration="1m9.12314811s" podCreationTimestamp="2026-02-19 08:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:46:27.122704468 +0000 UTC m=+89.110715950" watchObservedRunningTime="2026-02-19 08:46:27.12314811 +0000 UTC m=+89.111159622" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.131972 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/471aa962-9834-49ac-9bdd-2fa7de494d7f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.132037 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/471aa962-9834-49ac-9bdd-2fa7de494d7f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.132126 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/471aa962-9834-49ac-9bdd-2fa7de494d7f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.132165 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/471aa962-9834-49ac-9bdd-2fa7de494d7f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.132197 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/471aa962-9834-49ac-9bdd-2fa7de494d7f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.132324 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/471aa962-9834-49ac-9bdd-2fa7de494d7f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc 
kubenswrapper[4788]: I0219 08:46:27.132391 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/471aa962-9834-49ac-9bdd-2fa7de494d7f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.133456 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/471aa962-9834-49ac-9bdd-2fa7de494d7f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.147618 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/471aa962-9834-49ac-9bdd-2fa7de494d7f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.153613 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=63.153585066 podStartE2EDuration="1m3.153585066s" podCreationTimestamp="2026-02-19 08:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:46:27.153425902 +0000 UTC m=+89.141437394" watchObservedRunningTime="2026-02-19 08:46:27.153585066 +0000 UTC m=+89.141596578" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.161639 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/471aa962-9834-49ac-9bdd-2fa7de494d7f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-8tngb\" (UID: \"471aa962-9834-49ac-9bdd-2fa7de494d7f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.196744 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6lplm" podStartSLOduration=67.196707471 podStartE2EDuration="1m7.196707471s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:46:27.196359032 +0000 UTC m=+89.184370544" watchObservedRunningTime="2026-02-19 08:46:27.196707471 +0000 UTC m=+89.184718973" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.213322 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=37.213295817 podStartE2EDuration="37.213295817s" podCreationTimestamp="2026-02-19 08:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:46:27.213270306 +0000 UTC m=+89.201281788" watchObservedRunningTime="2026-02-19 08:46:27.213295817 +0000 UTC m=+89.201307329" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.279466 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9hxf6" podStartSLOduration=67.279438614 podStartE2EDuration="1m7.279438614s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:46:27.277990554 +0000 UTC m=+89.266002016" watchObservedRunningTime="2026-02-19 08:46:27.279438614 +0000 UTC m=+89.267450126" Feb 19 08:46:27 
crc kubenswrapper[4788]: I0219 08:46:27.293993 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.302817 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7s4rp" podStartSLOduration=67.302789996 podStartE2EDuration="1m7.302789996s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:46:27.300632186 +0000 UTC m=+89.288643658" watchObservedRunningTime="2026-02-19 08:46:27.302789996 +0000 UTC m=+89.290801498" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.368971 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-88dw2" podStartSLOduration=67.368946813 podStartE2EDuration="1m7.368946813s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:46:27.356541192 +0000 UTC m=+89.344552664" watchObservedRunningTime="2026-02-19 08:46:27.368946813 +0000 UTC m=+89.356958275" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.714183 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:27 crc kubenswrapper[4788]: E0219 08:46:27.714428 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.715180 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 03:01:38.50557188 +0000 UTC Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.715290 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 19 08:46:27 crc kubenswrapper[4788]: I0219 08:46:27.726010 4788 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 19 08:46:28 crc kubenswrapper[4788]: I0219 08:46:28.230363 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" event={"ID":"471aa962-9834-49ac-9bdd-2fa7de494d7f","Type":"ContainerStarted","Data":"b31f77283ba06ab336e8908289b29b1b30e1daac909cb8c0e4399b3ad7dbb2bd"} Feb 19 08:46:28 crc kubenswrapper[4788]: I0219 08:46:28.230454 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" event={"ID":"471aa962-9834-49ac-9bdd-2fa7de494d7f","Type":"ContainerStarted","Data":"350ea9be5fe9734209907ea94f519af67fd0e494b8bd242afe5f7752445ecc06"} Feb 19 08:46:28 crc kubenswrapper[4788]: I0219 08:46:28.251938 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-8tngb" podStartSLOduration=68.251911831 podStartE2EDuration="1m8.251911831s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:46:28.251034877 +0000 UTC m=+90.239046379" watchObservedRunningTime="2026-02-19 08:46:28.251911831 +0000 UTC m=+90.239923343" Feb 19 08:46:28 crc 
kubenswrapper[4788]: I0219 08:46:28.713499 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:28 crc kubenswrapper[4788]: I0219 08:46:28.713542 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:28 crc kubenswrapper[4788]: E0219 08:46:28.714925 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:28 crc kubenswrapper[4788]: I0219 08:46:28.715041 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:28 crc kubenswrapper[4788]: E0219 08:46:28.715322 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:28 crc kubenswrapper[4788]: E0219 08:46:28.716334 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:29 crc kubenswrapper[4788]: I0219 08:46:29.713959 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:29 crc kubenswrapper[4788]: E0219 08:46:29.714220 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:30 crc kubenswrapper[4788]: I0219 08:46:30.714291 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:30 crc kubenswrapper[4788]: I0219 08:46:30.714318 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:30 crc kubenswrapper[4788]: I0219 08:46:30.714523 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:30 crc kubenswrapper[4788]: E0219 08:46:30.714704 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:30 crc kubenswrapper[4788]: E0219 08:46:30.714887 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:30 crc kubenswrapper[4788]: E0219 08:46:30.714998 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:31 crc kubenswrapper[4788]: I0219 08:46:31.719270 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:31 crc kubenswrapper[4788]: E0219 08:46:31.721842 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:32 crc kubenswrapper[4788]: I0219 08:46:32.713876 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:32 crc kubenswrapper[4788]: E0219 08:46:32.714127 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:32 crc kubenswrapper[4788]: I0219 08:46:32.714574 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:32 crc kubenswrapper[4788]: E0219 08:46:32.714740 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:32 crc kubenswrapper[4788]: I0219 08:46:32.715761 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:32 crc kubenswrapper[4788]: E0219 08:46:32.715876 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:32 crc kubenswrapper[4788]: I0219 08:46:32.730769 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 19 08:46:33 crc kubenswrapper[4788]: I0219 08:46:33.713644 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:33 crc kubenswrapper[4788]: E0219 08:46:33.713857 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:33 crc kubenswrapper[4788]: I0219 08:46:33.715148 4788 scope.go:117] "RemoveContainer" containerID="0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e" Feb 19 08:46:33 crc kubenswrapper[4788]: E0219 08:46:33.715533 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" Feb 19 08:46:33 crc kubenswrapper[4788]: I0219 08:46:33.731046 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.73102348 podStartE2EDuration="1.73102348s" podCreationTimestamp="2026-02-19 08:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-19 08:46:33.730543966 +0000 UTC m=+95.718555478" watchObservedRunningTime="2026-02-19 08:46:33.73102348 +0000 UTC m=+95.719034962" Feb 19 08:46:34 crc kubenswrapper[4788]: I0219 08:46:34.714147 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:34 crc kubenswrapper[4788]: I0219 08:46:34.714202 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:34 crc kubenswrapper[4788]: E0219 08:46:34.714877 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:34 crc kubenswrapper[4788]: E0219 08:46:34.714939 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:34 crc kubenswrapper[4788]: I0219 08:46:34.715309 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:34 crc kubenswrapper[4788]: E0219 08:46:34.715578 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:35 crc kubenswrapper[4788]: I0219 08:46:35.713652 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:35 crc kubenswrapper[4788]: E0219 08:46:35.713791 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:36 crc kubenswrapper[4788]: I0219 08:46:36.713516 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:36 crc kubenswrapper[4788]: I0219 08:46:36.713560 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:36 crc kubenswrapper[4788]: E0219 08:46:36.713776 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:36 crc kubenswrapper[4788]: I0219 08:46:36.713887 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:36 crc kubenswrapper[4788]: E0219 08:46:36.714010 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:36 crc kubenswrapper[4788]: E0219 08:46:36.714187 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:37 crc kubenswrapper[4788]: I0219 08:46:37.713571 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:37 crc kubenswrapper[4788]: E0219 08:46:37.713761 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:38 crc kubenswrapper[4788]: I0219 08:46:38.171144 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:38 crc kubenswrapper[4788]: E0219 08:46:38.171358 4788 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:46:38 crc kubenswrapper[4788]: E0219 08:46:38.171450 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs podName:ad68454a-3350-49a5-9047-8b78e81ec79c nodeName:}" failed. No retries permitted until 2026-02-19 08:47:42.171424533 +0000 UTC m=+164.159436045 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs") pod "network-metrics-daemon-qbwlq" (UID: "ad68454a-3350-49a5-9047-8b78e81ec79c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 19 08:46:38 crc kubenswrapper[4788]: I0219 08:46:38.713494 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:38 crc kubenswrapper[4788]: I0219 08:46:38.713531 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:38 crc kubenswrapper[4788]: I0219 08:46:38.716074 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:38 crc kubenswrapper[4788]: E0219 08:46:38.716063 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:38 crc kubenswrapper[4788]: E0219 08:46:38.716683 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:38 crc kubenswrapper[4788]: E0219 08:46:38.716967 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:39 crc kubenswrapper[4788]: I0219 08:46:39.713851 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:39 crc kubenswrapper[4788]: E0219 08:46:39.715148 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:40 crc kubenswrapper[4788]: I0219 08:46:40.713834 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:40 crc kubenswrapper[4788]: I0219 08:46:40.713897 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:40 crc kubenswrapper[4788]: E0219 08:46:40.714080 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:40 crc kubenswrapper[4788]: I0219 08:46:40.714123 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:40 crc kubenswrapper[4788]: E0219 08:46:40.714386 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:40 crc kubenswrapper[4788]: E0219 08:46:40.714753 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:41 crc kubenswrapper[4788]: I0219 08:46:41.713966 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:41 crc kubenswrapper[4788]: E0219 08:46:41.714197 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:42 crc kubenswrapper[4788]: I0219 08:46:42.714481 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:42 crc kubenswrapper[4788]: I0219 08:46:42.714553 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:42 crc kubenswrapper[4788]: I0219 08:46:42.714506 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:42 crc kubenswrapper[4788]: E0219 08:46:42.714746 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:42 crc kubenswrapper[4788]: E0219 08:46:42.714926 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:42 crc kubenswrapper[4788]: E0219 08:46:42.715126 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:43 crc kubenswrapper[4788]: I0219 08:46:43.714383 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:43 crc kubenswrapper[4788]: E0219 08:46:43.714601 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:44 crc kubenswrapper[4788]: I0219 08:46:44.713755 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:44 crc kubenswrapper[4788]: I0219 08:46:44.713854 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:44 crc kubenswrapper[4788]: E0219 08:46:44.713970 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:44 crc kubenswrapper[4788]: I0219 08:46:44.714047 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:44 crc kubenswrapper[4788]: E0219 08:46:44.714425 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:44 crc kubenswrapper[4788]: E0219 08:46:44.714585 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:45 crc kubenswrapper[4788]: I0219 08:46:45.713391 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:45 crc kubenswrapper[4788]: E0219 08:46:45.713527 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:46 crc kubenswrapper[4788]: I0219 08:46:46.714095 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:46 crc kubenswrapper[4788]: I0219 08:46:46.714140 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:46 crc kubenswrapper[4788]: I0219 08:46:46.714153 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:46 crc kubenswrapper[4788]: E0219 08:46:46.715582 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:46 crc kubenswrapper[4788]: E0219 08:46:46.715765 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:46 crc kubenswrapper[4788]: E0219 08:46:46.715693 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:47 crc kubenswrapper[4788]: I0219 08:46:47.713584 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:47 crc kubenswrapper[4788]: E0219 08:46:47.713827 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:47 crc kubenswrapper[4788]: I0219 08:46:47.715029 4788 scope.go:117] "RemoveContainer" containerID="0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e" Feb 19 08:46:47 crc kubenswrapper[4788]: E0219 08:46:47.715472 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xmshh_openshift-ovn-kubernetes(fd5c1c46-74a4-41f4-ad05-af438781bd6a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" Feb 19 08:46:48 crc kubenswrapper[4788]: I0219 08:46:48.713414 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:48 crc kubenswrapper[4788]: I0219 08:46:48.713519 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:48 crc kubenswrapper[4788]: E0219 08:46:48.715192 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:48 crc kubenswrapper[4788]: I0219 08:46:48.715221 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:48 crc kubenswrapper[4788]: E0219 08:46:48.715335 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:48 crc kubenswrapper[4788]: E0219 08:46:48.715430 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:49 crc kubenswrapper[4788]: I0219 08:46:49.713563 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:49 crc kubenswrapper[4788]: E0219 08:46:49.713688 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:50 crc kubenswrapper[4788]: I0219 08:46:50.714070 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:50 crc kubenswrapper[4788]: I0219 08:46:50.714081 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:50 crc kubenswrapper[4788]: E0219 08:46:50.714319 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:50 crc kubenswrapper[4788]: I0219 08:46:50.714397 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:50 crc kubenswrapper[4788]: E0219 08:46:50.714505 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:50 crc kubenswrapper[4788]: E0219 08:46:50.714591 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:51 crc kubenswrapper[4788]: I0219 08:46:51.713300 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:51 crc kubenswrapper[4788]: E0219 08:46:51.713460 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:52 crc kubenswrapper[4788]: I0219 08:46:52.714393 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:52 crc kubenswrapper[4788]: I0219 08:46:52.714459 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:52 crc kubenswrapper[4788]: I0219 08:46:52.714609 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:52 crc kubenswrapper[4788]: E0219 08:46:52.714797 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:52 crc kubenswrapper[4788]: E0219 08:46:52.714968 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:52 crc kubenswrapper[4788]: E0219 08:46:52.715187 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:53 crc kubenswrapper[4788]: I0219 08:46:53.714226 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:53 crc kubenswrapper[4788]: E0219 08:46:53.714450 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:54 crc kubenswrapper[4788]: I0219 08:46:54.337894 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hxf6_a5c26787-29de-439a-86b8-920cac6c8ab8/kube-multus/1.log" Feb 19 08:46:54 crc kubenswrapper[4788]: I0219 08:46:54.338839 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hxf6_a5c26787-29de-439a-86b8-920cac6c8ab8/kube-multus/0.log" Feb 19 08:46:54 crc kubenswrapper[4788]: I0219 08:46:54.338921 4788 generic.go:334] "Generic (PLEG): container finished" podID="a5c26787-29de-439a-86b8-920cac6c8ab8" containerID="13b1bb93d87b038211f1e816a2498a060120d6338c3cae845dff3c87bb6e924d" exitCode=1 Feb 19 08:46:54 crc kubenswrapper[4788]: I0219 08:46:54.338972 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hxf6" event={"ID":"a5c26787-29de-439a-86b8-920cac6c8ab8","Type":"ContainerDied","Data":"13b1bb93d87b038211f1e816a2498a060120d6338c3cae845dff3c87bb6e924d"} Feb 19 08:46:54 crc kubenswrapper[4788]: I0219 08:46:54.339021 4788 scope.go:117] "RemoveContainer" containerID="217f582fd554787f3e948d6edff735e3f996b5852cd83d661dfea44dea760ec0" Feb 19 08:46:54 crc kubenswrapper[4788]: I0219 08:46:54.339755 4788 scope.go:117] "RemoveContainer" containerID="13b1bb93d87b038211f1e816a2498a060120d6338c3cae845dff3c87bb6e924d" Feb 19 08:46:54 crc kubenswrapper[4788]: E0219 08:46:54.340056 4788 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-9hxf6_openshift-multus(a5c26787-29de-439a-86b8-920cac6c8ab8)\"" pod="openshift-multus/multus-9hxf6" podUID="a5c26787-29de-439a-86b8-920cac6c8ab8" Feb 19 08:46:54 crc kubenswrapper[4788]: I0219 08:46:54.713855 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:54 crc kubenswrapper[4788]: I0219 08:46:54.713915 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:54 crc kubenswrapper[4788]: I0219 08:46:54.714072 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:54 crc kubenswrapper[4788]: E0219 08:46:54.714209 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:54 crc kubenswrapper[4788]: E0219 08:46:54.714315 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:54 crc kubenswrapper[4788]: E0219 08:46:54.714445 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:55 crc kubenswrapper[4788]: I0219 08:46:55.343531 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hxf6_a5c26787-29de-439a-86b8-920cac6c8ab8/kube-multus/1.log" Feb 19 08:46:55 crc kubenswrapper[4788]: I0219 08:46:55.714126 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:55 crc kubenswrapper[4788]: E0219 08:46:55.714314 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:56 crc kubenswrapper[4788]: I0219 08:46:56.714046 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:56 crc kubenswrapper[4788]: I0219 08:46:56.714089 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:46:56 crc kubenswrapper[4788]: I0219 08:46:56.714059 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:56 crc kubenswrapper[4788]: E0219 08:46:56.714241 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:56 crc kubenswrapper[4788]: E0219 08:46:56.714356 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 19 08:46:56 crc kubenswrapper[4788]: E0219 08:46:56.714451 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 19 08:46:57 crc kubenswrapper[4788]: I0219 08:46:57.713978 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:46:57 crc kubenswrapper[4788]: E0219 08:46:57.714408 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c" Feb 19 08:46:58 crc kubenswrapper[4788]: E0219 08:46:58.656682 4788 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 19 08:46:58 crc kubenswrapper[4788]: I0219 08:46:58.714595 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:46:58 crc kubenswrapper[4788]: E0219 08:46:58.717130 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 19 08:46:58 crc kubenswrapper[4788]: I0219 08:46:58.717219 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:46:58 crc kubenswrapper[4788]: E0219 08:46:58.717441 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 08:46:58 crc kubenswrapper[4788]: I0219 08:46:58.717277 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:46:58 crc kubenswrapper[4788]: E0219 08:46:58.717610 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 08:46:58 crc kubenswrapper[4788]: E0219 08:46:58.849587 4788 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 08:46:59 crc kubenswrapper[4788]: I0219 08:46:59.713747 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq"
Feb 19 08:46:59 crc kubenswrapper[4788]: E0219 08:46:59.714602 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c"
Feb 19 08:46:59 crc kubenswrapper[4788]: I0219 08:46:59.715050 4788 scope.go:117] "RemoveContainer" containerID="0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"
Feb 19 08:47:00 crc kubenswrapper[4788]: I0219 08:47:00.377937 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/3.log"
Feb 19 08:47:00 crc kubenswrapper[4788]: I0219 08:47:00.381438 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerStarted","Data":"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"}
Feb 19 08:47:00 crc kubenswrapper[4788]: I0219 08:47:00.382927 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh"
Feb 19 08:47:00 crc kubenswrapper[4788]: I0219 08:47:00.434489 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podStartSLOduration=100.43446262 podStartE2EDuration="1m40.43446262s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:00.432234608 +0000 UTC m=+122.420246100" watchObservedRunningTime="2026-02-19 08:47:00.43446262 +0000 UTC m=+122.422474122"
Feb 19 08:47:00 crc kubenswrapper[4788]: I0219 08:47:00.714129 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:47:00 crc kubenswrapper[4788]: I0219 08:47:00.714199 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:47:00 crc kubenswrapper[4788]: I0219 08:47:00.714144 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:47:00 crc kubenswrapper[4788]: E0219 08:47:00.714380 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 08:47:00 crc kubenswrapper[4788]: E0219 08:47:00.714497 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 08:47:00 crc kubenswrapper[4788]: E0219 08:47:00.714603 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 08:47:00 crc kubenswrapper[4788]: I0219 08:47:00.732176 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qbwlq"]
Feb 19 08:47:00 crc kubenswrapper[4788]: I0219 08:47:00.732361 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq"
Feb 19 08:47:00 crc kubenswrapper[4788]: E0219 08:47:00.732511 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c"
Feb 19 08:47:02 crc kubenswrapper[4788]: I0219 08:47:02.714086 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq"
Feb 19 08:47:02 crc kubenswrapper[4788]: I0219 08:47:02.714136 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:47:02 crc kubenswrapper[4788]: E0219 08:47:02.714770 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c"
Feb 19 08:47:02 crc kubenswrapper[4788]: I0219 08:47:02.714333 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:47:02 crc kubenswrapper[4788]: E0219 08:47:02.714897 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 08:47:02 crc kubenswrapper[4788]: I0219 08:47:02.714194 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:47:02 crc kubenswrapper[4788]: E0219 08:47:02.715038 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 08:47:02 crc kubenswrapper[4788]: E0219 08:47:02.715154 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 08:47:03 crc kubenswrapper[4788]: E0219 08:47:03.851640 4788 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 08:47:04 crc kubenswrapper[4788]: I0219 08:47:04.713809 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:47:04 crc kubenswrapper[4788]: I0219 08:47:04.713885 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:47:04 crc kubenswrapper[4788]: I0219 08:47:04.713913 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq"
Feb 19 08:47:04 crc kubenswrapper[4788]: E0219 08:47:04.713973 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 08:47:04 crc kubenswrapper[4788]: E0219 08:47:04.714083 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 08:47:04 crc kubenswrapper[4788]: I0219 08:47:04.714102 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:47:04 crc kubenswrapper[4788]: E0219 08:47:04.714371 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c"
Feb 19 08:47:04 crc kubenswrapper[4788]: E0219 08:47:04.714500 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 08:47:06 crc kubenswrapper[4788]: I0219 08:47:06.714325 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq"
Feb 19 08:47:06 crc kubenswrapper[4788]: I0219 08:47:06.714364 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:47:06 crc kubenswrapper[4788]: I0219 08:47:06.714406 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:47:06 crc kubenswrapper[4788]: I0219 08:47:06.714363 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:47:06 crc kubenswrapper[4788]: E0219 08:47:06.714535 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c"
Feb 19 08:47:06 crc kubenswrapper[4788]: E0219 08:47:06.714705 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 08:47:06 crc kubenswrapper[4788]: E0219 08:47:06.714881 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 08:47:06 crc kubenswrapper[4788]: E0219 08:47:06.714966 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 08:47:07 crc kubenswrapper[4788]: I0219 08:47:07.714069 4788 scope.go:117] "RemoveContainer" containerID="13b1bb93d87b038211f1e816a2498a060120d6338c3cae845dff3c87bb6e924d"
Feb 19 08:47:08 crc kubenswrapper[4788]: I0219 08:47:08.412957 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hxf6_a5c26787-29de-439a-86b8-920cac6c8ab8/kube-multus/1.log"
Feb 19 08:47:08 crc kubenswrapper[4788]: I0219 08:47:08.413272 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hxf6" event={"ID":"a5c26787-29de-439a-86b8-920cac6c8ab8","Type":"ContainerStarted","Data":"928812eee9cd494b61b87304c4ab4d58ffd651e9468a83240fec350eb56ec947"}
Feb 19 08:47:08 crc kubenswrapper[4788]: I0219 08:47:08.713723 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:47:08 crc kubenswrapper[4788]: I0219 08:47:08.713850 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq"
Feb 19 08:47:08 crc kubenswrapper[4788]: E0219 08:47:08.715840 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 08:47:08 crc kubenswrapper[4788]: I0219 08:47:08.715940 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:47:08 crc kubenswrapper[4788]: E0219 08:47:08.716198 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 08:47:08 crc kubenswrapper[4788]: E0219 08:47:08.716318 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c"
Feb 19 08:47:08 crc kubenswrapper[4788]: I0219 08:47:08.716599 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:47:08 crc kubenswrapper[4788]: E0219 08:47:08.716958 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 08:47:08 crc kubenswrapper[4788]: E0219 08:47:08.852603 4788 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 19 08:47:10 crc kubenswrapper[4788]: I0219 08:47:10.713888 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq"
Feb 19 08:47:10 crc kubenswrapper[4788]: E0219 08:47:10.714125 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c"
Feb 19 08:47:10 crc kubenswrapper[4788]: I0219 08:47:10.714476 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:47:10 crc kubenswrapper[4788]: I0219 08:47:10.714575 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:47:10 crc kubenswrapper[4788]: I0219 08:47:10.714582 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:47:10 crc kubenswrapper[4788]: E0219 08:47:10.714658 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 08:47:10 crc kubenswrapper[4788]: E0219 08:47:10.714882 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 08:47:10 crc kubenswrapper[4788]: E0219 08:47:10.715454 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 08:47:12 crc kubenswrapper[4788]: I0219 08:47:12.714081 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq"
Feb 19 08:47:12 crc kubenswrapper[4788]: I0219 08:47:12.714080 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:47:12 crc kubenswrapper[4788]: E0219 08:47:12.714335 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qbwlq" podUID="ad68454a-3350-49a5-9047-8b78e81ec79c"
Feb 19 08:47:12 crc kubenswrapper[4788]: I0219 08:47:12.714370 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:47:12 crc kubenswrapper[4788]: I0219 08:47:12.714405 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:47:12 crc kubenswrapper[4788]: E0219 08:47:12.714507 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 19 08:47:12 crc kubenswrapper[4788]: E0219 08:47:12.714723 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 19 08:47:12 crc kubenswrapper[4788]: E0219 08:47:12.715110 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 19 08:47:14 crc kubenswrapper[4788]: I0219 08:47:14.713670 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq"
Feb 19 08:47:14 crc kubenswrapper[4788]: I0219 08:47:14.713701 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 19 08:47:14 crc kubenswrapper[4788]: I0219 08:47:14.713823 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 19 08:47:14 crc kubenswrapper[4788]: I0219 08:47:14.714060 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 19 08:47:14 crc kubenswrapper[4788]: I0219 08:47:14.716626 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 19 08:47:14 crc kubenswrapper[4788]: I0219 08:47:14.716732 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 19 08:47:14 crc kubenswrapper[4788]: I0219 08:47:14.717120 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 19 08:47:14 crc kubenswrapper[4788]: I0219 08:47:14.717432 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 19 08:47:14 crc kubenswrapper[4788]: I0219 08:47:14.718854 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 19 08:47:14 crc kubenswrapper[4788]: I0219 08:47:14.719084 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.816506 4788 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.899022 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qgw8f"]
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.899781 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"]
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.900347 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.901281 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.903310 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"]
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.903881 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.908463 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.908494 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.909604 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.910089 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.910516 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.910891 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.912308 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.912535 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.913730 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.913992 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.914229 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.914456 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t7bpz"]
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.915076 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"]
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.915739 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.916731 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.916951 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.917002 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.917386 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz"]
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.918212 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.918301 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.921388 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds"]
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.922433 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.925632 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.925756 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.927017 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.928903 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.931020 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.933640 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.933791 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.934026 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.933650 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.935482 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.935719 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.935890 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.936119 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.936390 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.936592 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.936724 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.936850 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.937000 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.937142 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.937317 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6"]
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.937336 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.937510 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.937771 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.937863 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.937794 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.938074 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd"]
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.938150 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.938532 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.938563 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.938856 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.939054 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.939317 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-x5fpc"]
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.939859 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.941264 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.941629 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.941832 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.941982 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.942114 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.942422 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.942702 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.944650 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-h7qcn"]
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.945092 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-h7qcn"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.946058 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.947126 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.947325 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.947613 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.947812 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.948035 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.948326 4788 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.948809 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.949093 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.949877 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.950741 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.954333 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds"] Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.955197 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.955433 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8vngm"] Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.955208 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.955275 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.956199 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.958407 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xw477"] Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.959099 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.959150 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-jr6pt"] Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.975614 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:17 crc kubenswrapper[4788]: I0219 08:47:17.981506 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.002518 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.002707 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.006858 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.007101 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.007272 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.007397 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.007514 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.007637 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.008281 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.008508 4788 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.008663 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.009107 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.009149 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.009294 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.009398 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.010674 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.010749 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.010863 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.013078 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.013476 4788 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-zhsg9"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.014123 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pj5df"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.014571 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5cmw7"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.014991 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.015284 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.015372 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.015590 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.016232 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-brvcp"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.016590 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.016616 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.016955 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.017405 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.017647 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.017945 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.018512 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.018818 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.019084 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.022913 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.029433 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.030370 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.030602 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.030710 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.031410 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.031562 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-8tvwh"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.031763 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.032086 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.032237 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.033403 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.033543 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.033568 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.033646 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.033719 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.033754 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.034186 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.034643 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.034843 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.038627 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n2kj2"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.041559 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.041635 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.049539 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.050462 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.056257 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.060048 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.062954 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.063809 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.065227 4788 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.065783 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.072376 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.072972 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.074373 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.074962 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075360 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075725 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-serving-cert\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075753 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/104f8fd7-4fdc-4e3b-8028-30c2630091b6-etcd-client\") pod \"etcd-operator-b45778765-5cmw7\" (UID: 
\"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075791 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/104f8fd7-4fdc-4e3b-8028-30c2630091b6-config\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075816 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/104f8fd7-4fdc-4e3b-8028-30c2630091b6-etcd-ca\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075836 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-client-ca\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075854 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075871 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfmr8\" (UniqueName: 
\"kubernetes.io/projected/26d9a486-1abe-4d18-8b80-723c8d25ef89-kube-api-access-xfmr8\") pod \"cluster-samples-operator-665b6dd947-jmrsd\" (UID: \"26d9a486-1abe-4d18-8b80-723c8d25ef89\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075886 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b1add9a-33b1-4eec-b8fa-1b42f94b04d1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-286ds\" (UID: \"6b1add9a-33b1-4eec-b8fa-1b42f94b04d1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075901 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b1add9a-33b1-4eec-b8fa-1b42f94b04d1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-286ds\" (UID: \"6b1add9a-33b1-4eec-b8fa-1b42f94b04d1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075917 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d49ed318-47b5-4101-b4b9-09dda3667dd3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: \"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075931 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-serving-cert\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: 
\"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075946 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9945x\" (UniqueName: \"kubernetes.io/projected/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-kube-api-access-9945x\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075963 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96rxm\" (UniqueName: \"kubernetes.io/projected/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-kube-api-access-96rxm\") pod \"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075979 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9w9z\" (UniqueName: \"kubernetes.io/projected/104f8fd7-4fdc-4e3b-8028-30c2630091b6-kube-api-access-g9w9z\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.075995 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-config\") pod \"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076010 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/104f8fd7-4fdc-4e3b-8028-30c2630091b6-etcd-service-ca\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076031 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmkqr\" (UniqueName: \"kubernetes.io/projected/6e569289-11c5-4577-92e6-c8031eda90ec-kube-api-access-cmkqr\") pod \"dns-operator-744455d44c-zhsg9\" (UID: \"6e569289-11c5-4577-92e6-c8031eda90ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076046 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-oauth-config\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076061 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-oauth-serving-cert\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076074 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-config\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " 
pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076090 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrxl\" (UniqueName: \"kubernetes.io/projected/31c71a84-dec5-44b7-b970-0e7a7cb39a5e-kube-api-access-xbrxl\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7kds\" (UID: \"31c71a84-dec5-44b7-b970-0e7a7cb39a5e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076112 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e569289-11c5-4577-92e6-c8031eda90ec-metrics-tls\") pod \"dns-operator-744455d44c-zhsg9\" (UID: \"6e569289-11c5-4577-92e6-c8031eda90ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076127 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-config\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076152 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mtpx\" (UniqueName: \"kubernetes.io/projected/d49ed318-47b5-4101-b4b9-09dda3667dd3-kube-api-access-9mtpx\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: \"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076174 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-service-ca\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076193 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31c71a84-dec5-44b7-b970-0e7a7cb39a5e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7kds\" (UID: \"31c71a84-dec5-44b7-b970-0e7a7cb39a5e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076208 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26d9a486-1abe-4d18-8b80-723c8d25ef89-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jmrsd\" (UID: \"26d9a486-1abe-4d18-8b80-723c8d25ef89\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076226 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-684xx\" (UniqueName: \"kubernetes.io/projected/6b1add9a-33b1-4eec-b8fa-1b42f94b04d1-kube-api-access-684xx\") pod \"openshift-apiserver-operator-796bbdcf4f-286ds\" (UID: \"6b1add9a-33b1-4eec-b8fa-1b42f94b04d1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076255 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-auth-proxy-config\") pod \"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076272 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-machine-approver-tls\") pod \"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076291 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d49ed318-47b5-4101-b4b9-09dda3667dd3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: \"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076338 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31c71a84-dec5-44b7-b970-0e7a7cb39a5e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7kds\" (UID: \"31c71a84-dec5-44b7-b970-0e7a7cb39a5e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076358 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d49ed318-47b5-4101-b4b9-09dda3667dd3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: 
\"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076380 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4rc2\" (UniqueName: \"kubernetes.io/projected/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-kube-api-access-r4rc2\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076396 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/104f8fd7-4fdc-4e3b-8028-30c2630091b6-serving-cert\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076411 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-trusted-ca-bundle\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.076584 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.077033 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.078000 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.084721 4788 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.085218 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jdjtx"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.085597 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.085619 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.085955 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.086601 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.087052 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.087542 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.090292 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.090352 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.090915 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.098047 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.098382 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-s6bkf"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.098896 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.100521 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t7bpz"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.102004 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.102620 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.106005 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.106669 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.107030 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.107977 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.108839 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.111110 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.111298 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-h7qcn"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.111319 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9c2k4"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.112614 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qgw8f"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.112677 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.121105 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.122605 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-x5fpc"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.124224 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.124353 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5cmw7"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.127347 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.127373 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xw477"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.127384 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.129801 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.131540 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jr6pt"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.132810 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-zhsg9"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.138220 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.138273 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-brvcp"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.141550 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.144205 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.146601 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pj5df"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.149555 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.159961 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.161436 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-s6bkf"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.166467 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.167855 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.168874 4788 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n2kj2"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.171287 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.172573 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z2jtl"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.173211 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z2jtl" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.175087 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jdjtx"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.176053 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hbn8p"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.176886 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-684xx\" (UniqueName: \"kubernetes.io/projected/6b1add9a-33b1-4eec-b8fa-1b42f94b04d1-kube-api-access-684xx\") pod \"openshift-apiserver-operator-796bbdcf4f-286ds\" (UID: \"6b1add9a-33b1-4eec-b8fa-1b42f94b04d1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.176917 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-auth-proxy-config\") pod \"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.176936 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-machine-approver-tls\") pod \"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.176943 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hbn8p" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.176952 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d49ed318-47b5-4101-b4b9-09dda3667dd3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: \"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.176969 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31c71a84-dec5-44b7-b970-0e7a7cb39a5e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7kds\" (UID: \"31c71a84-dec5-44b7-b970-0e7a7cb39a5e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.176988 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d49ed318-47b5-4101-b4b9-09dda3667dd3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: \"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177010 4788 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-r4rc2\" (UniqueName: \"kubernetes.io/projected/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-kube-api-access-r4rc2\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177025 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/104f8fd7-4fdc-4e3b-8028-30c2630091b6-serving-cert\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177041 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-trusted-ca-bundle\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177057 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-serving-cert\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177073 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/104f8fd7-4fdc-4e3b-8028-30c2630091b6-etcd-client\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177095 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/104f8fd7-4fdc-4e3b-8028-30c2630091b6-config\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177113 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfmr8\" (UniqueName: \"kubernetes.io/projected/26d9a486-1abe-4d18-8b80-723c8d25ef89-kube-api-access-xfmr8\") pod \"cluster-samples-operator-665b6dd947-jmrsd\" (UID: \"26d9a486-1abe-4d18-8b80-723c8d25ef89\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177129 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/104f8fd7-4fdc-4e3b-8028-30c2630091b6-etcd-ca\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177146 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-client-ca\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177162 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177179 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b1add9a-33b1-4eec-b8fa-1b42f94b04d1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-286ds\" (UID: \"6b1add9a-33b1-4eec-b8fa-1b42f94b04d1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177197 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b1add9a-33b1-4eec-b8fa-1b42f94b04d1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-286ds\" (UID: \"6b1add9a-33b1-4eec-b8fa-1b42f94b04d1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177213 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d49ed318-47b5-4101-b4b9-09dda3667dd3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: \"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177233 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-serving-cert\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177269 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9945x\" (UniqueName: \"kubernetes.io/projected/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-kube-api-access-9945x\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: 
\"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177287 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9w9z\" (UniqueName: \"kubernetes.io/projected/104f8fd7-4fdc-4e3b-8028-30c2630091b6-kube-api-access-g9w9z\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177304 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96rxm\" (UniqueName: \"kubernetes.io/projected/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-kube-api-access-96rxm\") pod \"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177321 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-config\") pod \"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177340 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/104f8fd7-4fdc-4e3b-8028-30c2630091b6-etcd-service-ca\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177361 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmkqr\" (UniqueName: 
\"kubernetes.io/projected/6e569289-11c5-4577-92e6-c8031eda90ec-kube-api-access-cmkqr\") pod \"dns-operator-744455d44c-zhsg9\" (UID: \"6e569289-11c5-4577-92e6-c8031eda90ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177376 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-oauth-config\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177391 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-oauth-serving-cert\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177408 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-config\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177425 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e569289-11c5-4577-92e6-c8031eda90ec-metrics-tls\") pod \"dns-operator-744455d44c-zhsg9\" (UID: \"6e569289-11c5-4577-92e6-c8031eda90ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177442 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrxl\" (UniqueName: 
\"kubernetes.io/projected/31c71a84-dec5-44b7-b970-0e7a7cb39a5e-kube-api-access-xbrxl\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7kds\" (UID: \"31c71a84-dec5-44b7-b970-0e7a7cb39a5e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177458 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-config\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177480 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mtpx\" (UniqueName: \"kubernetes.io/projected/d49ed318-47b5-4101-b4b9-09dda3667dd3-kube-api-access-9mtpx\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: \"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177502 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-service-ca\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177518 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31c71a84-dec5-44b7-b970-0e7a7cb39a5e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7kds\" (UID: \"31c71a84-dec5-44b7-b970-0e7a7cb39a5e\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.177535 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26d9a486-1abe-4d18-8b80-723c8d25ef89-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jmrsd\" (UID: \"26d9a486-1abe-4d18-8b80-723c8d25ef89\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.179218 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-trusted-ca-bundle\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.179388 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-oauth-serving-cert\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.179390 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/104f8fd7-4fdc-4e3b-8028-30c2630091b6-config\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.179636 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-auth-proxy-config\") pod 
\"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.179901 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-config\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.180567 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d49ed318-47b5-4101-b4b9-09dda3667dd3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: \"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.182238 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-client-ca\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.182494 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e569289-11c5-4577-92e6-c8031eda90ec-metrics-tls\") pod \"dns-operator-744455d44c-zhsg9\" (UID: \"6e569289-11c5-4577-92e6-c8031eda90ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.182838 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/26d9a486-1abe-4d18-8b80-723c8d25ef89-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jmrsd\" (UID: \"26d9a486-1abe-4d18-8b80-723c8d25ef89\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.183090 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b1add9a-33b1-4eec-b8fa-1b42f94b04d1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-286ds\" (UID: \"6b1add9a-33b1-4eec-b8fa-1b42f94b04d1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.183420 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-machine-approver-tls\") pod \"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.183920 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b1add9a-33b1-4eec-b8fa-1b42f94b04d1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-286ds\" (UID: \"6b1add9a-33b1-4eec-b8fa-1b42f94b04d1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.183966 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8vngm"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.183993 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.184056 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.184087 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31c71a84-dec5-44b7-b970-0e7a7cb39a5e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7kds\" (UID: \"31c71a84-dec5-44b7-b970-0e7a7cb39a5e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.184113 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/104f8fd7-4fdc-4e3b-8028-30c2630091b6-etcd-client\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.184126 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-config\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.184472 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-config\") pod \"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" 
Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.184642 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-service-ca\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.184667 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/104f8fd7-4fdc-4e3b-8028-30c2630091b6-etcd-service-ca\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.184672 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31c71a84-dec5-44b7-b970-0e7a7cb39a5e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7kds\" (UID: \"31c71a84-dec5-44b7-b970-0e7a7cb39a5e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.184713 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-serving-cert\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.185058 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/104f8fd7-4fdc-4e3b-8028-30c2630091b6-serving-cert\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.185383 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.185762 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.186177 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/104f8fd7-4fdc-4e3b-8028-30c2630091b6-etcd-ca\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.186176 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-oauth-config\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.186893 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d49ed318-47b5-4101-b4b9-09dda3667dd3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: \"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.187104 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-serving-cert\") pod \"controller-manager-879f6c89f-t7bpz\" (UID: 
\"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.187220 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.188410 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.189616 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.190638 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.191796 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.192848 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9c2k4"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.193957 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z2jtl"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.195151 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hbn8p"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.196315 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-n6shm"] Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.197007 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n6shm" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.207866 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.222703 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.243511 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.275494 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.285118 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.303811 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.328972 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.343643 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.363923 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.384105 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 08:47:18 crc 
kubenswrapper[4788]: I0219 08:47:18.403968 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.423681 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.443828 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.464451 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.484071 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.503855 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.523632 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.544158 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.564107 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.584430 4788 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.604647 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.624323 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.646569 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.664201 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.684162 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.703950 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.723827 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.744328 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.763843 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.785068 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.803923 4788 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.823914 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.844090 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.864079 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.884308 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.904757 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.924108 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.944705 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 19 08:47:18 crc kubenswrapper[4788]: I0219 08:47:18.987830 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.004414 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.024689 
4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.052656 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.064446 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.082850 4788 request.go:700] Waited for 1.007679099s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0 Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.084839 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.104458 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.125125 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.143968 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.164011 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.184449 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 
08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.204561 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.237533 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.244657 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.264667 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.284074 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.303617 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.323770 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.343953 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.385188 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.392602 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.392789 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-serving-cert\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.392897 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89374177-14ef-4b9a-938a-a838d6d0aab1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.393007 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/039e68bf-13c6-484b-a4db-229d6d6b5886-encryption-config\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.393100 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-encryption-config\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 
08:47:19.393416 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.393545 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-bound-sa-token\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.393760 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.393946 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-serving-cert\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.394052 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.394301 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdhck\" (UniqueName: \"kubernetes.io/projected/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-kube-api-access-rdhck\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.394486 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkdn9\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-kube-api-access-lkdn9\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.394782 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/039e68bf-13c6-484b-a4db-229d6d6b5886-etcd-client\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.394903 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/039e68bf-13c6-484b-a4db-229d6d6b5886-audit-dir\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: E0219 
08:47:19.394936 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:19.894913929 +0000 UTC m=+141.882925441 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.395000 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-989lg\" (UniqueName: \"kubernetes.io/projected/ec69babc-944a-4707-914f-5f1da38d6316-kube-api-access-989lg\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.395122 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec69babc-944a-4707-914f-5f1da38d6316-serving-cert\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.395319 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.395432 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89374177-14ef-4b9a-938a-a838d6d0aab1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.395541 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6fb576-402e-4972-aa09-089865ce389b-serving-cert\") pod \"openshift-config-operator-7777fb866f-8vngm\" (UID: \"9f6fb576-402e-4972-aa09-089865ce389b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.395650 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j29sf\" (UniqueName: \"kubernetes.io/projected/9f6fb576-402e-4972-aa09-089865ce389b-kube-api-access-j29sf\") pod \"openshift-config-operator-7777fb866f-8vngm\" (UID: \"9f6fb576-402e-4972-aa09-089865ce389b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.395710 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq6f9\" (UniqueName: \"kubernetes.io/projected/b590f5cb-d3a6-43b9-97ef-29ad515ecbc9-kube-api-access-hq6f9\") pod \"downloads-7954f5f757-h7qcn\" (UID: \"b590f5cb-d3a6-43b9-97ef-29ad515ecbc9\") " pod="openshift-console/downloads-7954f5f757-h7qcn" Feb 19 08:47:19 crc 
kubenswrapper[4788]: I0219 08:47:19.395831 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-audit-dir\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.395889 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.395936 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-policies\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.395977 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.396120 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs75w\" (UniqueName: \"kubernetes.io/projected/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-kube-api-access-xs75w\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.396169 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-config\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.396218 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9f6fb576-402e-4972-aa09-089865ce389b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8vngm\" (UID: \"9f6fb576-402e-4972-aa09-089865ce389b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.396295 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-audit-policies\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.396339 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-dir\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.396382 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.396639 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.396801 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-audit\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.396977 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l5hg\" (UniqueName: \"kubernetes.io/projected/49e1dd56-37f0-41b8-8afa-d040c5750fac-kube-api-access-5l5hg\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.397399 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/039e68bf-13c6-484b-a4db-229d6d6b5886-serving-cert\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.398422 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.398507 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-etcd-serving-ca\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.398552 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.398718 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.398883 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.399074 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/039e68bf-13c6-484b-a4db-229d6d6b5886-node-pullsecrets\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.399163 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.399839 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d865\" (UniqueName: \"kubernetes.io/projected/2ecb616f-62fc-4ff2-a353-6e08c63581a8-kube-api-access-8d865\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.399949 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-client-ca\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.400100 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-trusted-ca\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.400345 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ecb616f-62fc-4ff2-a353-6e08c63581a8-config\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.400537 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkfnt\" (UniqueName: \"kubernetes.io/projected/039e68bf-13c6-484b-a4db-229d6d6b5886-kube-api-access-nkfnt\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.400653 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-etcd-client\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.400824 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-tls\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.400948 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-certificates\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.401048 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-config\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.401486 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2ecb616f-62fc-4ff2-a353-6e08c63581a8-images\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.401688 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-config\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.401798 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.401905 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.402011 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ecb616f-62fc-4ff2-a353-6e08c63581a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.402128 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-image-import-ca\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.402231 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.404287 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.426633 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.444873 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.464689 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.484239 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.503569 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.503740 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.503782 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b012062-484f-4ad8-99c5-37425cb6e3e1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q56wb\" (UID: \"2b012062-484f-4ad8-99c5-37425cb6e3e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.503852 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-etcd-client\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.503890 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-certificates\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.503928 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ccbb120-c076-426f-b24b-fe0530b3e056-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdz4t\" (UID: \"3ccbb120-c076-426f-b24b-fe0530b3e056\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.503967 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-config\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504010 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2ecb616f-62fc-4ff2-a353-6e08c63581a8-images\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504061 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxkqb\" (UniqueName: \"kubernetes.io/projected/ec358d29-e3c8-4f69-a1bc-7879193b026a-kube-api-access-kxkqb\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504109 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ecb616f-62fc-4ff2-a353-6e08c63581a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504156 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b37fe460-3670-48fc-8eaf-d0bbf8b26557-serving-cert\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504208 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-image-import-ca\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504242 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/192b6105-6538-462c-8b1c-3a1a69cea50d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4mmpf\" (UID: \"192b6105-6538-462c-8b1c-3a1a69cea50d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504346 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5a3854ab-c63a-4697-975c-e049cc23e0d3-srv-cert\") pod \"olm-operator-6b444d44fb-sfqbb\" (UID: \"5a3854ab-c63a-4697-975c-e049cc23e0d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504377 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25pqh\" (UniqueName: \"kubernetes.io/projected/7ea42bdd-b321-4b32-91d3-0e1cb559ace7-kube-api-access-25pqh\") pod \"multus-admission-controller-857f4d67dd-n2kj2\" (UID: \"7ea42bdd-b321-4b32-91d3-0e1cb559ace7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504411 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89374177-14ef-4b9a-938a-a838d6d0aab1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504447 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/039e68bf-13c6-484b-a4db-229d6d6b5886-encryption-config\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504478 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-encryption-config\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504523 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlwk2\" (UniqueName: \"kubernetes.io/projected/e285d56c-2894-48f6-987d-217b4efd8f6e-kube-api-access-jlwk2\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: \"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504556 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8z8z\" (UniqueName: \"kubernetes.io/projected/37572f7a-2fcf-4d28-993d-cd924c0a78b8-kube-api-access-h8z8z\") pod \"control-plane-machine-set-operator-78cbb6b69f-ng2p7\" (UID: \"37572f7a-2fcf-4d28-993d-cd924c0a78b8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504588 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/85f68684-96fb-43fd-bdd0-384451fb1a58-srv-cert\") pod \"catalog-operator-68c6474976-zq4ss\" (UID: \"85f68684-96fb-43fd-bdd0-384451fb1a58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504617 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66grx\" (UniqueName: \"kubernetes.io/projected/7e1aeeee-05bb-4755-bc17-7cb87025639a-kube-api-access-66grx\") pod \"machine-config-server-n6shm\" (UID: \"7e1aeeee-05bb-4755-bc17-7cb87025639a\") " pod="openshift-machine-config-operator/machine-config-server-n6shm"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504648 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/37572f7a-2fcf-4d28-993d-cd924c0a78b8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ng2p7\" (UID: \"37572f7a-2fcf-4d28-993d-cd924c0a78b8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504680 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac75663-d01c-4122-b30a-faf65d9a063a-serving-cert\") pod \"service-ca-operator-777779d784-sm7pt\" (UID: \"0ac75663-d01c-4122-b30a-faf65d9a063a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504715 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-serving-cert\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504748 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fdbp\" (UniqueName: \"kubernetes.io/projected/85f68684-96fb-43fd-bdd0-384451fb1a58-kube-api-access-8fdbp\") pod \"catalog-operator-68c6474976-zq4ss\" (UID: \"85f68684-96fb-43fd-bdd0-384451fb1a58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504780 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6826dee1-4dec-4b7c-88a1-600eb014574c-secret-volume\") pod \"collect-profiles-29524845-8nftq\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504816 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-mountpoint-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504866 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504904 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/55518a10-e84e-48fa-bc48-fb357a94f6ea-signing-cabundle\") pod \"service-ca-9c57cc56f-s6bkf\" (UID: \"55518a10-e84e-48fa-bc48-fb357a94f6ea\") " pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.504970 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8422f1bf-0f17-43df-8425-93d34601dedd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bmknd\" (UID: \"8422f1bf-0f17-43df-8425-93d34601dedd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505061 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/039e68bf-13c6-484b-a4db-229d6d6b5886-etcd-client\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505110 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-989lg\" (UniqueName: \"kubernetes.io/projected/ec69babc-944a-4707-914f-5f1da38d6316-kube-api-access-989lg\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505187 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b-metrics-tls\") pod \"dns-default-hbn8p\" (UID: \"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b\") " pod="openshift-dns/dns-default-hbn8p"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505288 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkdn9\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-kube-api-access-lkdn9\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505338 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e285d56c-2894-48f6-987d-217b4efd8f6e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: \"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505403 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec69babc-944a-4707-914f-5f1da38d6316-serving-cert\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505468 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac75663-d01c-4122-b30a-faf65d9a063a-config\") pod \"service-ca-operator-777779d784-sm7pt\" (UID: \"0ac75663-d01c-4122-b30a-faf65d9a063a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505510 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-audit-dir\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505542 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhj66\" (UniqueName: \"kubernetes.io/projected/a3eda937-2ad6-4192-bd7f-c04b1697636e-kube-api-access-mhj66\") pod \"ingress-canary-z2jtl\" (UID: \"a3eda937-2ad6-4192-bd7f-c04b1697636e\") " pod="openshift-ingress-canary/ingress-canary-z2jtl"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505581 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505612 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrz4d\" (UniqueName: \"kubernetes.io/projected/5a3854ab-c63a-4697-975c-e049cc23e0d3-kube-api-access-xrz4d\") pod \"olm-operator-6b444d44fb-sfqbb\" (UID: \"5a3854ab-c63a-4697-975c-e049cc23e0d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505647 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7lw4\" (UniqueName: \"kubernetes.io/projected/f4b40324-2628-4351-a48f-c607b5a19114-kube-api-access-w7lw4\") pod \"machine-config-controller-84d6567774-kq7pj\" (UID: \"f4b40324-2628-4351-a48f-c607b5a19114\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505711 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/55518a10-e84e-48fa-bc48-fb357a94f6ea-signing-key\") pod \"service-ca-9c57cc56f-s6bkf\" (UID: \"55518a10-e84e-48fa-bc48-fb357a94f6ea\") " pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505739 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-plugins-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505768 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-default-certificate\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505823 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-audit\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505853 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-audit-policies\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505885 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlpzl\" (UniqueName: \"kubernetes.io/projected/aef03b84-8d54-4cff-a236-20478d45a45d-kube-api-access-rlpzl\")
pod \"migrator-59844c95c7-nfw9m\" (UID: \"aef03b84-8d54-4cff-a236-20478d45a45d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505918 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8422f1bf-0f17-43df-8425-93d34601dedd-config\") pod \"kube-apiserver-operator-766d6c64bb-bmknd\" (UID: \"8422f1bf-0f17-43df-8425-93d34601dedd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505950 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jh7r\" (UniqueName: \"kubernetes.io/projected/55518a10-e84e-48fa-bc48-fb357a94f6ea-kube-api-access-8jh7r\") pod \"service-ca-9c57cc56f-s6bkf\" (UID: \"55518a10-e84e-48fa-bc48-fb357a94f6ea\") " pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.505983 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4b40324-2628-4351-a48f-c607b5a19114-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kq7pj\" (UID: \"f4b40324-2628-4351-a48f-c607b5a19114\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506016 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-etcd-serving-ca\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506052 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506136 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjfgr\" (UniqueName: \"kubernetes.io/projected/6826dee1-4dec-4b7c-88a1-600eb014574c-kube-api-access-bjfgr\") pod \"collect-profiles-29524845-8nftq\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506190 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbfkd\" (UniqueName: \"kubernetes.io/projected/09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b-kube-api-access-bbfkd\") pod \"dns-default-hbn8p\" (UID: \"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b\") " pod="openshift-dns/dns-default-hbn8p" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506221 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37fe460-3670-48fc-8eaf-d0bbf8b26557-config\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506281 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx6p5\" (UniqueName: \"kubernetes.io/projected/9192ab47-cd4b-4d49-916f-e1454d517b52-kube-api-access-dx6p5\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: 
\"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506317 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/039e68bf-13c6-484b-a4db-229d6d6b5886-node-pullsecrets\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506349 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506382 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d865\" (UniqueName: \"kubernetes.io/projected/2ecb616f-62fc-4ff2-a353-6e08c63581a8-kube-api-access-8d865\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506417 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b012062-484f-4ad8-99c5-37425cb6e3e1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q56wb\" (UID: \"2b012062-484f-4ad8-99c5-37425cb6e3e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506449 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljg8c\" (UniqueName: \"kubernetes.io/projected/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-kube-api-access-ljg8c\") pod \"marketplace-operator-79b997595-jdjtx\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") " pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506481 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9192ab47-cd4b-4d49-916f-e1454d517b52-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: \"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506512 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/85f68684-96fb-43fd-bdd0-384451fb1a58-profile-collector-cert\") pod \"catalog-operator-68c6474976-zq4ss\" (UID: \"85f68684-96fb-43fd-bdd0-384451fb1a58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506548 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-client-ca\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506577 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5a3854ab-c63a-4697-975c-e049cc23e0d3-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-sfqbb\" (UID: \"5a3854ab-c63a-4697-975c-e049cc23e0d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506609 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ccbb120-c076-426f-b24b-fe0530b3e056-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdz4t\" (UID: \"3ccbb120-c076-426f-b24b-fe0530b3e056\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506639 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws2nl\" (UniqueName: \"kubernetes.io/projected/b37fe460-3670-48fc-8eaf-d0bbf8b26557-kube-api-access-ws2nl\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506669 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9192ab47-cd4b-4d49-916f-e1454d517b52-proxy-tls\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: \"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506718 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-csi-data-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506753 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkfnt\" (UniqueName: \"kubernetes.io/projected/039e68bf-13c6-484b-a4db-229d6d6b5886-kube-api-access-nkfnt\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506784 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b012062-484f-4ad8-99c5-37425cb6e3e1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q56wb\" (UID: \"2b012062-484f-4ad8-99c5-37425cb6e3e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506818 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-tls\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506849 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e285d56c-2894-48f6-987d-217b4efd8f6e-trusted-ca\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: \"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506881 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzwkr\" (UniqueName: \"kubernetes.io/projected/0ac75663-d01c-4122-b30a-faf65d9a063a-kube-api-access-rzwkr\") pod \"service-ca-operator-777779d784-sm7pt\" (UID: 
\"0ac75663-d01c-4122-b30a-faf65d9a063a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506925 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/192b6105-6538-462c-8b1c-3a1a69cea50d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4mmpf\" (UID: \"192b6105-6538-462c-8b1c-3a1a69cea50d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506958 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.506987 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3eda937-2ad6-4192-bd7f-c04b1697636e-cert\") pod \"ingress-canary-z2jtl\" (UID: \"a3eda937-2ad6-4192-bd7f-c04b1697636e\") " pod="openshift-ingress-canary/ingress-canary-z2jtl" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507020 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns925\" (UniqueName: \"kubernetes.io/projected/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-kube-api-access-ns925\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507053 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-config\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507085 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507115 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b37fe460-3670-48fc-8eaf-d0bbf8b26557-trusted-ca\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507168 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507202 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-stats-auth\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " 
pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507286 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507319 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-registration-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507352 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8w29\" (UniqueName: \"kubernetes.io/projected/0c05cbec-c6b7-439f-90b2-589e9068b6eb-kube-api-access-c8w29\") pod \"package-server-manager-789f6589d5-j6k7l\" (UID: \"0c05cbec-c6b7-439f-90b2-589e9068b6eb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507387 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-serving-cert\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507475 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" 
(UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507505 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7e1aeeee-05bb-4755-bc17-7cb87025639a-certs\") pod \"machine-config-server-n6shm\" (UID: \"7e1aeeee-05bb-4755-bc17-7cb87025639a\") " pod="openshift-machine-config-operator/machine-config-server-n6shm" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507547 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-bound-sa-token\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507619 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4b40324-2628-4351-a48f-c607b5a19114-proxy-tls\") pod \"machine-config-controller-84d6567774-kq7pj\" (UID: \"f4b40324-2628-4351-a48f-c607b5a19114\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507673 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdhck\" (UniqueName: \"kubernetes.io/projected/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-kube-api-access-rdhck\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: 
I0219 08:47:19.507708 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c05cbec-c6b7-439f-90b2-589e9068b6eb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j6k7l\" (UID: \"0c05cbec-c6b7-439f-90b2-589e9068b6eb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507741 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/039e68bf-13c6-484b-a4db-229d6d6b5886-audit-dir\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507772 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8422f1bf-0f17-43df-8425-93d34601dedd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bmknd\" (UID: \"8422f1bf-0f17-43df-8425-93d34601dedd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507802 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b-config-volume\") pod \"dns-default-hbn8p\" (UID: \"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b\") " pod="openshift-dns/dns-default-hbn8p" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507874 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-service-ca-bundle\") pod \"authentication-operator-69f744f599-x5fpc\" 
(UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507912 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e285d56c-2894-48f6-987d-217b4efd8f6e-metrics-tls\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: \"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507948 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89374177-14ef-4b9a-938a-a838d6d0aab1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.507993 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6fb576-402e-4972-aa09-089865ce389b-serving-cert\") pod \"openshift-config-operator-7777fb866f-8vngm\" (UID: \"9f6fb576-402e-4972-aa09-089865ce389b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508029 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j29sf\" (UniqueName: \"kubernetes.io/projected/9f6fb576-402e-4972-aa09-089865ce389b-kube-api-access-j29sf\") pod \"openshift-config-operator-7777fb866f-8vngm\" (UID: \"9f6fb576-402e-4972-aa09-089865ce389b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508062 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-metrics-certs\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508116 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq6f9\" (UniqueName: \"kubernetes.io/projected/b590f5cb-d3a6-43b9-97ef-29ad515ecbc9-kube-api-access-hq6f9\") pod \"downloads-7954f5f757-h7qcn\" (UID: \"b590f5cb-d3a6-43b9-97ef-29ad515ecbc9\") " pod="openshift-console/downloads-7954f5f757-h7qcn" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508149 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6826dee1-4dec-4b7c-88a1-600eb014574c-config-volume\") pod \"collect-profiles-29524845-8nftq\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508206 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508240 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-policies\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 
crc kubenswrapper[4788]: I0219 08:47:19.508298 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b09b21d-9a25-46cf-92c9-a1e427c068c6-apiservice-cert\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508329 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jdjtx\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") " pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508393 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-config\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508428 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs75w\" (UniqueName: \"kubernetes.io/projected/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-kube-api-access-xs75w\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" Feb 19 08:47:19 crc kubenswrapper[4788]: E0219 08:47:19.508468 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.008441019 +0000 UTC m=+141.996452531 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508510 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3b09b21d-9a25-46cf-92c9-a1e427c068c6-tmpfs\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508558 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9f6fb576-402e-4972-aa09-089865ce389b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8vngm\" (UID: \"9f6fb576-402e-4972-aa09-089865ce389b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508596 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-dir\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508632 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508664 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508701 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l5hg\" (UniqueName: \"kubernetes.io/projected/49e1dd56-37f0-41b8-8afa-d040c5750fac-kube-api-access-5l5hg\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508759 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3b09b21d-9a25-46cf-92c9-a1e427c068c6-webhook-cert\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508794 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7ea42bdd-b321-4b32-91d3-0e1cb559ace7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n2kj2\" (UID: 
\"7ea42bdd-b321-4b32-91d3-0e1cb559ace7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508829 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9192ab47-cd4b-4d49-916f-e1454d517b52-images\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: \"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508884 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/039e68bf-13c6-484b-a4db-229d6d6b5886-serving-cert\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508916 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508953 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fmf2\" (UniqueName: \"kubernetes.io/projected/3b09b21d-9a25-46cf-92c9-a1e427c068c6-kube-api-access-7fmf2\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.508987 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jdjtx\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") " pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.509081 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.509127 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.509204 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/192b6105-6538-462c-8b1c-3a1a69cea50d-config\") pod \"kube-controller-manager-operator-78b949d7b-4mmpf\" (UID: \"192b6105-6538-462c-8b1c-3a1a69cea50d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.509277 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-socket-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " 
pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.509351 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6f2x\" (UniqueName: \"kubernetes.io/projected/3ccbb120-c076-426f-b24b-fe0530b3e056-kube-api-access-z6f2x\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdz4t\" (UID: \"3ccbb120-c076-426f-b24b-fe0530b3e056\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.509393 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7e1aeeee-05bb-4755-bc17-7cb87025639a-node-bootstrap-token\") pod \"machine-config-server-n6shm\" (UID: \"7e1aeeee-05bb-4755-bc17-7cb87025639a\") " pod="openshift-machine-config-operator/machine-config-server-n6shm" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.509425 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-service-ca-bundle\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.509464 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-trusted-ca\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.509499 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2ecb616f-62fc-4ff2-a353-6e08c63581a8-config\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.510433 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/039e68bf-13c6-484b-a4db-229d6d6b5886-node-pullsecrets\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.510721 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/039e68bf-13c6-484b-a4db-229d6d6b5886-audit-dir\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.511047 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ecb616f-62fc-4ff2-a353-6e08c63581a8-config\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.511679 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.512310 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-config\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.513370 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-client-ca\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.513579 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.515148 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-etcd-serving-ca\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.515602 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-service-ca-bundle\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.515738 4788 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-config\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.517680 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.518734 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-audit\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.518734 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-audit-policies\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.518860 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-dir\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.519112 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9f6fb576-402e-4972-aa09-089865ce389b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-8vngm\" (UID: \"9f6fb576-402e-4972-aa09-089865ce389b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.519750 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.519883 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec69babc-944a-4707-914f-5f1da38d6316-serving-cert\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.520801 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2ecb616f-62fc-4ff2-a353-6e08c63581a8-images\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.520871 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-etcd-client\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc 
kubenswrapper[4788]: I0219 08:47:19.522623 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-audit-dir\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.527815 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-config\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.528335 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/039e68bf-13c6-484b-a4db-229d6d6b5886-serving-cert\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.528765 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-policies\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.529039 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.529302 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-certificates\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.529819 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.530859 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.531057 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89374177-14ef-4b9a-938a-a838d6d0aab1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.531547 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xw477\" (UID: 
\"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.532108 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/039e68bf-13c6-484b-a4db-229d6d6b5886-image-import-ca\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.532285 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/039e68bf-13c6-484b-a4db-229d6d6b5886-encryption-config\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.532456 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.532537 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2ecb616f-62fc-4ff2-a353-6e08c63581a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.532958 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-trusted-ca\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.533506 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.532855 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.533846 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-serving-cert\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.534997 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.535165 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.535789 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.537923 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6fb576-402e-4972-aa09-089865ce389b-serving-cert\") pod \"openshift-config-operator-7777fb866f-8vngm\" (UID: \"9f6fb576-402e-4972-aa09-089865ce389b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.539708 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/039e68bf-13c6-484b-a4db-229d6d6b5886-etcd-client\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.539830 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.541085 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-encryption-config\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.544119 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-serving-cert\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.545650 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.549589 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.550439 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89374177-14ef-4b9a-938a-a838d6d0aab1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.555683 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-tls\") pod \"image-registry-697d97f7c8-pj5df\" (UID: 
\"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.564943 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.585058 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.604123 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611050 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b012062-484f-4ad8-99c5-37425cb6e3e1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q56wb\" (UID: \"2b012062-484f-4ad8-99c5-37425cb6e3e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611078 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37fe460-3670-48fc-8eaf-d0bbf8b26557-config\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611096 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx6p5\" (UniqueName: \"kubernetes.io/projected/9192ab47-cd4b-4d49-916f-e1454d517b52-kube-api-access-dx6p5\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: \"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611120 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5a3854ab-c63a-4697-975c-e049cc23e0d3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sfqbb\" (UID: \"5a3854ab-c63a-4697-975c-e049cc23e0d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611136 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljg8c\" (UniqueName: \"kubernetes.io/projected/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-kube-api-access-ljg8c\") pod \"marketplace-operator-79b997595-jdjtx\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") " pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611151 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9192ab47-cd4b-4d49-916f-e1454d517b52-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: \"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611622 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/85f68684-96fb-43fd-bdd0-384451fb1a58-profile-collector-cert\") pod \"catalog-operator-68c6474976-zq4ss\" (UID: \"85f68684-96fb-43fd-bdd0-384451fb1a58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611650 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ccbb120-c076-426f-b24b-fe0530b3e056-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdz4t\" (UID: 
\"3ccbb120-c076-426f-b24b-fe0530b3e056\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611672 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws2nl\" (UniqueName: \"kubernetes.io/projected/b37fe460-3670-48fc-8eaf-d0bbf8b26557-kube-api-access-ws2nl\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611688 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9192ab47-cd4b-4d49-916f-e1454d517b52-proxy-tls\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: \"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611704 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-csi-data-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611727 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b012062-484f-4ad8-99c5-37425cb6e3e1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q56wb\" (UID: \"2b012062-484f-4ad8-99c5-37425cb6e3e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611743 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzwkr\" 
(UniqueName: \"kubernetes.io/projected/0ac75663-d01c-4122-b30a-faf65d9a063a-kube-api-access-rzwkr\") pod \"service-ca-operator-777779d784-sm7pt\" (UID: \"0ac75663-d01c-4122-b30a-faf65d9a063a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611760 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e285d56c-2894-48f6-987d-217b4efd8f6e-trusted-ca\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: \"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611789 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/192b6105-6538-462c-8b1c-3a1a69cea50d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4mmpf\" (UID: \"192b6105-6538-462c-8b1c-3a1a69cea50d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611805 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3eda937-2ad6-4192-bd7f-c04b1697636e-cert\") pod \"ingress-canary-z2jtl\" (UID: \"a3eda937-2ad6-4192-bd7f-c04b1697636e\") " pod="openshift-ingress-canary/ingress-canary-z2jtl" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611825 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns925\" (UniqueName: \"kubernetes.io/projected/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-kube-api-access-ns925\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611841 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b37fe460-3670-48fc-8eaf-d0bbf8b26557-trusted-ca\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611860 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8w29\" (UniqueName: \"kubernetes.io/projected/0c05cbec-c6b7-439f-90b2-589e9068b6eb-kube-api-access-c8w29\") pod \"package-server-manager-789f6589d5-j6k7l\" (UID: \"0c05cbec-c6b7-439f-90b2-589e9068b6eb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611876 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-stats-auth\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611898 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-registration-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611913 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7e1aeeee-05bb-4755-bc17-7cb87025639a-certs\") pod \"machine-config-server-n6shm\" (UID: \"7e1aeeee-05bb-4755-bc17-7cb87025639a\") " pod="openshift-machine-config-operator/machine-config-server-n6shm" Feb 19 08:47:19 crc 
kubenswrapper[4788]: I0219 08:47:19.611950 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4b40324-2628-4351-a48f-c607b5a19114-proxy-tls\") pod \"machine-config-controller-84d6567774-kq7pj\" (UID: \"f4b40324-2628-4351-a48f-c607b5a19114\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611977 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c05cbec-c6b7-439f-90b2-589e9068b6eb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j6k7l\" (UID: \"0c05cbec-c6b7-439f-90b2-589e9068b6eb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.611999 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8422f1bf-0f17-43df-8425-93d34601dedd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bmknd\" (UID: \"8422f1bf-0f17-43df-8425-93d34601dedd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612013 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b-config-volume\") pod \"dns-default-hbn8p\" (UID: \"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b\") " pod="openshift-dns/dns-default-hbn8p" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612029 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e285d56c-2894-48f6-987d-217b4efd8f6e-metrics-tls\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: 
\"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612047 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-metrics-certs\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612076 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6826dee1-4dec-4b7c-88a1-600eb014574c-config-volume\") pod \"collect-profiles-29524845-8nftq\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612093 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jdjtx\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") " pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612109 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b09b21d-9a25-46cf-92c9-a1e427c068c6-apiservice-cert\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612138 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/3b09b21d-9a25-46cf-92c9-a1e427c068c6-tmpfs\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612155 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3b09b21d-9a25-46cf-92c9-a1e427c068c6-webhook-cert\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612169 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7ea42bdd-b321-4b32-91d3-0e1cb559ace7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n2kj2\" (UID: \"7ea42bdd-b321-4b32-91d3-0e1cb559ace7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612174 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9192ab47-cd4b-4d49-916f-e1454d517b52-auth-proxy-config\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: \"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612188 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jdjtx\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") " pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 
08:47:19.612272 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9192ab47-cd4b-4d49-916f-e1454d517b52-images\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: \"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612304 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fmf2\" (UniqueName: \"kubernetes.io/projected/3b09b21d-9a25-46cf-92c9-a1e427c068c6-kube-api-access-7fmf2\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612328 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/192b6105-6538-462c-8b1c-3a1a69cea50d-config\") pod \"kube-controller-manager-operator-78b949d7b-4mmpf\" (UID: \"192b6105-6538-462c-8b1c-3a1a69cea50d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612371 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-socket-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612401 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6f2x\" (UniqueName: \"kubernetes.io/projected/3ccbb120-c076-426f-b24b-fe0530b3e056-kube-api-access-z6f2x\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdz4t\" (UID: 
\"3ccbb120-c076-426f-b24b-fe0530b3e056\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612425 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7e1aeeee-05bb-4755-bc17-7cb87025639a-node-bootstrap-token\") pod \"machine-config-server-n6shm\" (UID: \"7e1aeeee-05bb-4755-bc17-7cb87025639a\") " pod="openshift-machine-config-operator/machine-config-server-n6shm" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612447 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-service-ca-bundle\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612473 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b012062-484f-4ad8-99c5-37425cb6e3e1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q56wb\" (UID: \"2b012062-484f-4ad8-99c5-37425cb6e3e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612502 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxkqb\" (UniqueName: \"kubernetes.io/projected/ec358d29-e3c8-4f69-a1bc-7879193b026a-kube-api-access-kxkqb\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612522 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3ccbb120-c076-426f-b24b-fe0530b3e056-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdz4t\" (UID: \"3ccbb120-c076-426f-b24b-fe0530b3e056\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612545 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b37fe460-3670-48fc-8eaf-d0bbf8b26557-serving-cert\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612567 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/192b6105-6538-462c-8b1c-3a1a69cea50d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4mmpf\" (UID: \"192b6105-6538-462c-8b1c-3a1a69cea50d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612590 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5a3854ab-c63a-4697-975c-e049cc23e0d3-srv-cert\") pod \"olm-operator-6b444d44fb-sfqbb\" (UID: \"5a3854ab-c63a-4697-975c-e049cc23e0d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612609 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25pqh\" (UniqueName: \"kubernetes.io/projected/7ea42bdd-b321-4b32-91d3-0e1cb559ace7-kube-api-access-25pqh\") pod \"multus-admission-controller-857f4d67dd-n2kj2\" (UID: \"7ea42bdd-b321-4b32-91d3-0e1cb559ace7\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612639 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612661 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlwk2\" (UniqueName: \"kubernetes.io/projected/e285d56c-2894-48f6-987d-217b4efd8f6e-kube-api-access-jlwk2\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: \"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612683 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8z8z\" (UniqueName: \"kubernetes.io/projected/37572f7a-2fcf-4d28-993d-cd924c0a78b8-kube-api-access-h8z8z\") pod \"control-plane-machine-set-operator-78cbb6b69f-ng2p7\" (UID: \"37572f7a-2fcf-4d28-993d-cd924c0a78b8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612703 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/85f68684-96fb-43fd-bdd0-384451fb1a58-srv-cert\") pod \"catalog-operator-68c6474976-zq4ss\" (UID: \"85f68684-96fb-43fd-bdd0-384451fb1a58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612726 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66grx\" 
(UniqueName: \"kubernetes.io/projected/7e1aeeee-05bb-4755-bc17-7cb87025639a-kube-api-access-66grx\") pod \"machine-config-server-n6shm\" (UID: \"7e1aeeee-05bb-4755-bc17-7cb87025639a\") " pod="openshift-machine-config-operator/machine-config-server-n6shm" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612749 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/37572f7a-2fcf-4d28-993d-cd924c0a78b8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ng2p7\" (UID: \"37572f7a-2fcf-4d28-993d-cd924c0a78b8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612772 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac75663-d01c-4122-b30a-faf65d9a063a-serving-cert\") pod \"service-ca-operator-777779d784-sm7pt\" (UID: \"0ac75663-d01c-4122-b30a-faf65d9a063a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612794 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fdbp\" (UniqueName: \"kubernetes.io/projected/85f68684-96fb-43fd-bdd0-384451fb1a58-kube-api-access-8fdbp\") pod \"catalog-operator-68c6474976-zq4ss\" (UID: \"85f68684-96fb-43fd-bdd0-384451fb1a58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612815 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/55518a10-e84e-48fa-bc48-fb357a94f6ea-signing-cabundle\") pod \"service-ca-9c57cc56f-s6bkf\" (UID: \"55518a10-e84e-48fa-bc48-fb357a94f6ea\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612838 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6826dee1-4dec-4b7c-88a1-600eb014574c-secret-volume\") pod \"collect-profiles-29524845-8nftq\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612858 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-mountpoint-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612882 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8422f1bf-0f17-43df-8425-93d34601dedd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bmknd\" (UID: \"8422f1bf-0f17-43df-8425-93d34601dedd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612928 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b-metrics-tls\") pod \"dns-default-hbn8p\" (UID: \"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b\") " pod="openshift-dns/dns-default-hbn8p" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612988 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e285d56c-2894-48f6-987d-217b4efd8f6e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: 
\"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613018 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac75663-d01c-4122-b30a-faf65d9a063a-config\") pod \"service-ca-operator-777779d784-sm7pt\" (UID: \"0ac75663-d01c-4122-b30a-faf65d9a063a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613048 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhj66\" (UniqueName: \"kubernetes.io/projected/a3eda937-2ad6-4192-bd7f-c04b1697636e-kube-api-access-mhj66\") pod \"ingress-canary-z2jtl\" (UID: \"a3eda937-2ad6-4192-bd7f-c04b1697636e\") " pod="openshift-ingress-canary/ingress-canary-z2jtl" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613107 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrz4d\" (UniqueName: \"kubernetes.io/projected/5a3854ab-c63a-4697-975c-e049cc23e0d3-kube-api-access-xrz4d\") pod \"olm-operator-6b444d44fb-sfqbb\" (UID: \"5a3854ab-c63a-4697-975c-e049cc23e0d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613129 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-default-certificate\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613152 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7lw4\" (UniqueName: 
\"kubernetes.io/projected/f4b40324-2628-4351-a48f-c607b5a19114-kube-api-access-w7lw4\") pod \"machine-config-controller-84d6567774-kq7pj\" (UID: \"f4b40324-2628-4351-a48f-c607b5a19114\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613174 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/55518a10-e84e-48fa-bc48-fb357a94f6ea-signing-key\") pod \"service-ca-9c57cc56f-s6bkf\" (UID: \"55518a10-e84e-48fa-bc48-fb357a94f6ea\") " pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613226 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-plugins-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613285 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlpzl\" (UniqueName: \"kubernetes.io/projected/aef03b84-8d54-4cff-a236-20478d45a45d-kube-api-access-rlpzl\") pod \"migrator-59844c95c7-nfw9m\" (UID: \"aef03b84-8d54-4cff-a236-20478d45a45d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613315 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8422f1bf-0f17-43df-8425-93d34601dedd-config\") pod \"kube-apiserver-operator-766d6c64bb-bmknd\" (UID: \"8422f1bf-0f17-43df-8425-93d34601dedd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613335 4788 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ccbb120-c076-426f-b24b-fe0530b3e056-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdz4t\" (UID: \"3ccbb120-c076-426f-b24b-fe0530b3e056\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613347 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jh7r\" (UniqueName: \"kubernetes.io/projected/55518a10-e84e-48fa-bc48-fb357a94f6ea-kube-api-access-8jh7r\") pod \"service-ca-9c57cc56f-s6bkf\" (UID: \"55518a10-e84e-48fa-bc48-fb357a94f6ea\") " pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613442 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4b40324-2628-4351-a48f-c607b5a19114-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-kq7pj\" (UID: \"f4b40324-2628-4351-a48f-c607b5a19114\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613476 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjfgr\" (UniqueName: \"kubernetes.io/projected/6826dee1-4dec-4b7c-88a1-600eb014574c-kube-api-access-bjfgr\") pod \"collect-profiles-29524845-8nftq\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613510 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbfkd\" (UniqueName: \"kubernetes.io/projected/09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b-kube-api-access-bbfkd\") pod \"dns-default-hbn8p\" (UID: \"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b\") " 
pod="openshift-dns/dns-default-hbn8p" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.613891 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9192ab47-cd4b-4d49-916f-e1454d517b52-images\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: \"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.614222 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/192b6105-6538-462c-8b1c-3a1a69cea50d-config\") pod \"kube-controller-manager-operator-78b949d7b-4mmpf\" (UID: \"192b6105-6538-462c-8b1c-3a1a69cea50d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.614505 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-socket-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.614649 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e285d56c-2894-48f6-987d-217b4efd8f6e-trusted-ca\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: \"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.612333 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-csi-data-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " 
pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.615376 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b37fe460-3670-48fc-8eaf-d0bbf8b26557-trusted-ca\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.615680 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-service-ca-bundle\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.615767 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/192b6105-6538-462c-8b1c-3a1a69cea50d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4mmpf\" (UID: \"192b6105-6538-462c-8b1c-3a1a69cea50d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf" Feb 19 08:47:19 crc kubenswrapper[4788]: E0219 08:47:19.615915 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.115885235 +0000 UTC m=+142.103896807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.615922 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-registration-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.616191 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/85f68684-96fb-43fd-bdd0-384451fb1a58-profile-collector-cert\") pod \"catalog-operator-68c6474976-zq4ss\" (UID: \"85f68684-96fb-43fd-bdd0-384451fb1a58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.616568 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-mountpoint-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.617230 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac75663-d01c-4122-b30a-faf65d9a063a-config\") pod \"service-ca-operator-777779d784-sm7pt\" (UID: \"0ac75663-d01c-4122-b30a-faf65d9a063a\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.617288 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ec358d29-e3c8-4f69-a1bc-7879193b026a-plugins-dir\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.618169 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8422f1bf-0f17-43df-8425-93d34601dedd-config\") pod \"kube-apiserver-operator-766d6c64bb-bmknd\" (UID: \"8422f1bf-0f17-43df-8425-93d34601dedd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.619177 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jdjtx\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") " pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.619457 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5a3854ab-c63a-4697-975c-e049cc23e0d3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sfqbb\" (UID: \"5a3854ab-c63a-4697-975c-e049cc23e0d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.619645 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f4b40324-2628-4351-a48f-c607b5a19114-mcc-auth-proxy-config\") pod 
\"machine-config-controller-84d6567774-kq7pj\" (UID: \"f4b40324-2628-4351-a48f-c607b5a19114\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.620169 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3b09b21d-9a25-46cf-92c9-a1e427c068c6-tmpfs\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.620528 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6826dee1-4dec-4b7c-88a1-600eb014574c-secret-volume\") pod \"collect-profiles-29524845-8nftq\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.620605 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-stats-auth\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.620139 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f4b40324-2628-4351-a48f-c607b5a19114-proxy-tls\") pod \"machine-config-controller-84d6567774-kq7pj\" (UID: \"f4b40324-2628-4351-a48f-c607b5a19114\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.621001 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b37fe460-3670-48fc-8eaf-d0bbf8b26557-serving-cert\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.621292 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/55518a10-e84e-48fa-bc48-fb357a94f6ea-signing-cabundle\") pod \"service-ca-9c57cc56f-s6bkf\" (UID: \"55518a10-e84e-48fa-bc48-fb357a94f6ea\") " pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.621568 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37fe460-3670-48fc-8eaf-d0bbf8b26557-config\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.622070 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jdjtx\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") " pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.623219 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ac75663-d01c-4122-b30a-faf65d9a063a-serving-cert\") pod \"service-ca-operator-777779d784-sm7pt\" (UID: \"0ac75663-d01c-4122-b30a-faf65d9a063a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.623399 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9192ab47-cd4b-4d49-916f-e1454d517b52-proxy-tls\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: \"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.623914 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3b09b21d-9a25-46cf-92c9-a1e427c068c6-apiservice-cert\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.624114 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6826dee1-4dec-4b7c-88a1-600eb014574c-config-volume\") pod \"collect-profiles-29524845-8nftq\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.624654 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.625840 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e285d56c-2894-48f6-987d-217b4efd8f6e-metrics-tls\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: \"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.625899 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5a3854ab-c63a-4697-975c-e049cc23e0d3-srv-cert\") pod \"olm-operator-6b444d44fb-sfqbb\" (UID: 
\"5a3854ab-c63a-4697-975c-e049cc23e0d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.627451 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7ea42bdd-b321-4b32-91d3-0e1cb559ace7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-n2kj2\" (UID: \"7ea42bdd-b321-4b32-91d3-0e1cb559ace7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.627792 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-default-certificate\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.627915 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8422f1bf-0f17-43df-8425-93d34601dedd-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bmknd\" (UID: \"8422f1bf-0f17-43df-8425-93d34601dedd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.630227 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ccbb120-c076-426f-b24b-fe0530b3e056-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdz4t\" (UID: \"3ccbb120-c076-426f-b24b-fe0530b3e056\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.630522 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/3b09b21d-9a25-46cf-92c9-a1e427c068c6-webhook-cert\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.631026 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/37572f7a-2fcf-4d28-993d-cd924c0a78b8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ng2p7\" (UID: \"37572f7a-2fcf-4d28-993d-cd924c0a78b8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.631689 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-metrics-certs\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.633179 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/55518a10-e84e-48fa-bc48-fb357a94f6ea-signing-key\") pod \"service-ca-9c57cc56f-s6bkf\" (UID: \"55518a10-e84e-48fa-bc48-fb357a94f6ea\") " pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.639482 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c05cbec-c6b7-439f-90b2-589e9068b6eb-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-j6k7l\" (UID: \"0c05cbec-c6b7-439f-90b2-589e9068b6eb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" Feb 19 08:47:19 crc kubenswrapper[4788]: 
I0219 08:47:19.643333 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.651098 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/85f68684-96fb-43fd-bdd0-384451fb1a58-srv-cert\") pod \"catalog-operator-68c6474976-zq4ss\" (UID: \"85f68684-96fb-43fd-bdd0-384451fb1a58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.663763 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.677431 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b012062-484f-4ad8-99c5-37425cb6e3e1-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q56wb\" (UID: \"2b012062-484f-4ad8-99c5-37425cb6e3e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.684601 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.703358 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.715002 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:19 crc kubenswrapper[4788]: E0219 08:47:19.715147 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.215125727 +0000 UTC m=+142.203137209 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.715341 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: E0219 08:47:19.715764 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.215745834 +0000 UTC m=+142.203757346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.724466 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.738376 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b012062-484f-4ad8-99c5-37425cb6e3e1-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q56wb\" (UID: \"2b012062-484f-4ad8-99c5-37425cb6e3e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.744094 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.763993 4788 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.784418 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.804307 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.817382 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:19 crc kubenswrapper[4788]: E0219 08:47:19.817624 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.31759077 +0000 UTC m=+142.305602282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.818195 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: E0219 08:47:19.818681 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.318664551 +0000 UTC m=+142.306676013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.819484 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3eda937-2ad6-4192-bd7f-c04b1697636e-cert\") pod \"ingress-canary-z2jtl\" (UID: \"a3eda937-2ad6-4192-bd7f-c04b1697636e\") " pod="openshift-ingress-canary/ingress-canary-z2jtl" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.824036 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.844352 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.865013 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.884219 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.894961 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b-config-volume\") pod \"dns-default-hbn8p\" (UID: \"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b\") " pod="openshift-dns/dns-default-hbn8p" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.905324 4788 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.919477 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:19 crc kubenswrapper[4788]: E0219 08:47:19.919715 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.419677173 +0000 UTC m=+142.407688675 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.919923 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:19 crc kubenswrapper[4788]: E0219 08:47:19.920367 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-02-19 08:47:20.420347173 +0000 UTC m=+142.408358685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.926587 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.933409 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b-metrics-tls\") pod \"dns-default-hbn8p\" (UID: \"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b\") " pod="openshift-dns/dns-default-hbn8p" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.958756 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfmr8\" (UniqueName: \"kubernetes.io/projected/26d9a486-1abe-4d18-8b80-723c8d25ef89-kube-api-access-xfmr8\") pod \"cluster-samples-operator-665b6dd947-jmrsd\" (UID: \"26d9a486-1abe-4d18-8b80-723c8d25ef89\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd" Feb 19 08:47:19 crc kubenswrapper[4788]: I0219 08:47:19.990702 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrxl\" (UniqueName: \"kubernetes.io/projected/31c71a84-dec5-44b7-b970-0e7a7cb39a5e-kube-api-access-xbrxl\") pod \"kube-storage-version-migrator-operator-b67b599dd-p7kds\" (UID: \"31c71a84-dec5-44b7-b970-0e7a7cb39a5e\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.007524 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d49ed318-47b5-4101-b4b9-09dda3667dd3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: \"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.021566 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.021706 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.521678204 +0000 UTC m=+142.509689706 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.022052 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.022449 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.522432745 +0000 UTC m=+142.510444257 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.029078 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-684xx\" (UniqueName: \"kubernetes.io/projected/6b1add9a-33b1-4eec-b8fa-1b42f94b04d1-kube-api-access-684xx\") pod \"openshift-apiserver-operator-796bbdcf4f-286ds\" (UID: \"6b1add9a-33b1-4eec-b8fa-1b42f94b04d1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.044293 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4rc2\" (UniqueName: \"kubernetes.io/projected/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-kube-api-access-r4rc2\") pod \"console-f9d7485db-jr6pt\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.070982 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9w9z\" (UniqueName: \"kubernetes.io/projected/104f8fd7-4fdc-4e3b-8028-30c2630091b6-kube-api-access-g9w9z\") pod \"etcd-operator-b45778765-5cmw7\" (UID: \"104f8fd7-4fdc-4e3b-8028-30c2630091b6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.093914 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9945x\" (UniqueName: \"kubernetes.io/projected/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-kube-api-access-9945x\") pod 
\"controller-manager-879f6c89f-t7bpz\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.102941 4788 request.go:700] Waited for 1.918294561s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.103585 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96rxm\" (UniqueName: \"kubernetes.io/projected/2bebdf2c-5451-43b9-b9fd-dc182d5edde1-kube-api-access-96rxm\") pod \"machine-approver-56656f9798-kqbfz\" (UID: \"2bebdf2c-5451-43b9-b9fd-dc182d5edde1\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.123572 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.123829 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.623805208 +0000 UTC m=+142.611816680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.123979 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.124517 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.624495278 +0000 UTC m=+142.612506780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.130463 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mtpx\" (UniqueName: \"kubernetes.io/projected/d49ed318-47b5-4101-b4b9-09dda3667dd3-kube-api-access-9mtpx\") pod \"cluster-image-registry-operator-dc59b4c8b-pcjq6\" (UID: \"d49ed318-47b5-4101-b4b9-09dda3667dd3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.144046 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.148546 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmkqr\" (UniqueName: \"kubernetes.io/projected/6e569289-11c5-4577-92e6-c8031eda90ec-kube-api-access-cmkqr\") pod \"dns-operator-744455d44c-zhsg9\" (UID: \"6e569289-11c5-4577-92e6-c8031eda90ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.155167 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.166079 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.181360 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7e1aeeee-05bb-4755-bc17-7cb87025639a-node-bootstrap-token\") pod \"machine-config-server-n6shm\" (UID: \"7e1aeeee-05bb-4755-bc17-7cb87025639a\") " pod="openshift-machine-config-operator/machine-config-server-n6shm"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.184471 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.188097 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.200007 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7e1aeeee-05bb-4755-bc17-7cb87025639a-certs\") pod \"machine-config-server-n6shm\" (UID: \"7e1aeeee-05bb-4755-bc17-7cb87025639a\") " pod="openshift-machine-config-operator/machine-config-server-n6shm"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.205561 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds"
Feb 19 08:47:20 crc kubenswrapper[4788]: W0219 08:47:20.210760 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bebdf2c_5451_43b9_b9fd_dc182d5edde1.slice/crio-fb47af354b62cae841b0d610ba2555cf96ca75256cde85fb2958ccbf57fee0ed WatchSource:0}: Error finding container fb47af354b62cae841b0d610ba2555cf96ca75256cde85fb2958ccbf57fee0ed: Status 404 returned error can't find the container with id fb47af354b62cae841b0d610ba2555cf96ca75256cde85fb2958ccbf57fee0ed
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.221327 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.227991 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.228347 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.72830084 +0000 UTC m=+142.716312352 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.229049 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.229696 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.729673539 +0000 UTC m=+142.717685071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.234673 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.253081 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.258071 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs75w\" (UniqueName: \"kubernetes.io/projected/6e744116-7d2b-4c8d-847b-8b0dc683e2d8-kube-api-access-xs75w\") pod \"authentication-operator-69f744f599-x5fpc\" (UID: \"6e744116-7d2b-4c8d-847b-8b0dc683e2d8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.274454 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-bound-sa-token\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.277818 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jr6pt"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.284434 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.287734 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq6f9\" (UniqueName: \"kubernetes.io/projected/b590f5cb-d3a6-43b9-97ef-29ad515ecbc9-kube-api-access-hq6f9\") pod \"downloads-7954f5f757-h7qcn\" (UID: \"b590f5cb-d3a6-43b9-97ef-29ad515ecbc9\") " pod="openshift-console/downloads-7954f5f757-h7qcn"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.306411 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d865\" (UniqueName: \"kubernetes.io/projected/2ecb616f-62fc-4ff2-a353-6e08c63581a8-kube-api-access-8d865\") pod \"machine-api-operator-5694c8668f-9s2zg\" (UID: \"2ecb616f-62fc-4ff2-a353-6e08c63581a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.311542 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.323903 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdhck\" (UniqueName: \"kubernetes.io/projected/3c317fc6-2b8f-40ea-97a8-03a1da391b8f-kube-api-access-rdhck\") pod \"apiserver-7bbb656c7d-bpl9k\" (UID: \"3c317fc6-2b8f-40ea-97a8-03a1da391b8f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.330541 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.331579 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.831551916 +0000 UTC m=+142.819563428 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.342969 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkfnt\" (UniqueName: \"kubernetes.io/projected/039e68bf-13c6-484b-a4db-229d6d6b5886-kube-api-access-nkfnt\") pod \"apiserver-76f77b778f-qgw8f\" (UID: \"039e68bf-13c6-484b-a4db-229d6d6b5886\") " pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.364497 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.376648 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j29sf\" (UniqueName: \"kubernetes.io/projected/9f6fb576-402e-4972-aa09-089865ce389b-kube-api-access-j29sf\") pod \"openshift-config-operator-7777fb866f-8vngm\" (UID: \"9f6fb576-402e-4972-aa09-089865ce389b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.380300 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.382488 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-989lg\" (UniqueName: \"kubernetes.io/projected/ec69babc-944a-4707-914f-5f1da38d6316-kube-api-access-989lg\") pod \"route-controller-manager-6576b87f9c-khmhl\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.403206 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l5hg\" (UniqueName: \"kubernetes.io/projected/49e1dd56-37f0-41b8-8afa-d040c5750fac-kube-api-access-5l5hg\") pod \"oauth-openshift-558db77b4-xw477\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.410384 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.423859 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkdn9\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-kube-api-access-lkdn9\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.435800 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.436229 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.436959 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:20.936939563 +0000 UTC m=+142.924951075 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.440745 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljg8c\" (UniqueName: \"kubernetes.io/projected/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-kube-api-access-ljg8c\") pod \"marketplace-operator-79b997595-jdjtx\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") " pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.457420 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.466029 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx6p5\" (UniqueName: \"kubernetes.io/projected/9192ab47-cd4b-4d49-916f-e1454d517b52-kube-api-access-dx6p5\") pod \"machine-config-operator-74547568cd-fxs8h\" (UID: \"9192ab47-cd4b-4d49-916f-e1454d517b52\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.470078 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" event={"ID":"2bebdf2c-5451-43b9-b9fd-dc182d5edde1","Type":"ContainerStarted","Data":"fb47af354b62cae841b0d610ba2555cf96ca75256cde85fb2958ccbf57fee0ed"}
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.481894 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws2nl\" (UniqueName: \"kubernetes.io/projected/b37fe460-3670-48fc-8eaf-d0bbf8b26557-kube-api-access-ws2nl\") pod \"console-operator-58897d9998-brvcp\" (UID: \"b37fe460-3670-48fc-8eaf-d0bbf8b26557\") " pod="openshift-console-operator/console-operator-58897d9998-brvcp"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.497053 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t7bpz"]
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.500008 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzwkr\" (UniqueName: \"kubernetes.io/projected/0ac75663-d01c-4122-b30a-faf65d9a063a-kube-api-access-rzwkr\") pod \"service-ca-operator-777779d784-sm7pt\" (UID: \"0ac75663-d01c-4122-b30a-faf65d9a063a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt"
Feb 19 08:47:20 crc kubenswrapper[4788]: W0219 08:47:20.512453 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4f8048f_eccd_44c1_b8b7_63aace0ec7f7.slice/crio-0b0ee3426058dbe9a8dd9f849b4d56bc740a08513892307e285e1bf6d93e87da WatchSource:0}: Error finding container 0b0ee3426058dbe9a8dd9f849b4d56bc740a08513892307e285e1bf6d93e87da: Status 404 returned error can't find the container with id 0b0ee3426058dbe9a8dd9f849b4d56bc740a08513892307e285e1bf6d93e87da
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.537988 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.538692 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.038678316 +0000 UTC m=+143.026689778 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.540927 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.552124 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-h7qcn"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.554316 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fmf2\" (UniqueName: \"kubernetes.io/projected/3b09b21d-9a25-46cf-92c9-a1e427c068c6-kube-api-access-7fmf2\") pod \"packageserver-d55dfcdfc-ltdd8\" (UID: \"3b09b21d-9a25-46cf-92c9-a1e427c068c6\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.564371 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.565980 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns925\" (UniqueName: \"kubernetes.io/projected/164b83b5-7bd4-4bea-8f7e-76c83c46a4b0-kube-api-access-ns925\") pod \"router-default-5444994796-8tvwh\" (UID: \"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0\") " pod="openshift-ingress/router-default-5444994796-8tvwh"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.572717 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.578499 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8w29\" (UniqueName: \"kubernetes.io/projected/0c05cbec-c6b7-439f-90b2-589e9068b6eb-kube-api-access-c8w29\") pod \"package-server-manager-789f6589d5-j6k7l\" (UID: \"0c05cbec-c6b7-439f-90b2-589e9068b6eb\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.589117 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbfkd\" (UniqueName: \"kubernetes.io/projected/09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b-kube-api-access-bbfkd\") pod \"dns-default-hbn8p\" (UID: \"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b\") " pod="openshift-dns/dns-default-hbn8p"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.609784 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds"]
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.614489 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6f2x\" (UniqueName: \"kubernetes.io/projected/3ccbb120-c076-426f-b24b-fe0530b3e056-kube-api-access-z6f2x\") pod \"openshift-controller-manager-operator-756b6f6bc6-bdz4t\" (UID: \"3ccbb120-c076-426f-b24b-fe0530b3e056\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.625581 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8422f1bf-0f17-43df-8425-93d34601dedd-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bmknd\" (UID: \"8422f1bf-0f17-43df-8425-93d34601dedd\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.626498 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-brvcp"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.634592 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.639659 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b012062-484f-4ad8-99c5-37425cb6e3e1-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-q56wb\" (UID: \"2b012062-484f-4ad8-99c5-37425cb6e3e1\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.639704 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.640016 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.140000067 +0000 UTC m=+143.128011539 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.652686 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.661651 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxkqb\" (UniqueName: \"kubernetes.io/projected/ec358d29-e3c8-4f69-a1bc-7879193b026a-kube-api-access-kxkqb\") pod \"csi-hostpathplugin-9c2k4\" (UID: \"ec358d29-e3c8-4f69-a1bc-7879193b026a\") " pod="hostpath-provisioner/csi-hostpathplugin-9c2k4"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.679142 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-8tvwh"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.679193 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66grx\" (UniqueName: \"kubernetes.io/projected/7e1aeeee-05bb-4755-bc17-7cb87025639a-kube-api-access-66grx\") pod \"machine-config-server-n6shm\" (UID: \"7e1aeeee-05bb-4755-bc17-7cb87025639a\") " pod="openshift-machine-config-operator/machine-config-server-n6shm"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.698082 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/192b6105-6538-462c-8b1c-3a1a69cea50d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4mmpf\" (UID: \"192b6105-6538-462c-8b1c-3a1a69cea50d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.698787 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.718904 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.728528 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8z8z\" (UniqueName: \"kubernetes.io/projected/37572f7a-2fcf-4d28-993d-cd924c0a78b8-kube-api-access-h8z8z\") pod \"control-plane-machine-set-operator-78cbb6b69f-ng2p7\" (UID: \"37572f7a-2fcf-4d28-993d-cd924c0a78b8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.740011 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhj66\" (UniqueName: \"kubernetes.io/projected/a3eda937-2ad6-4192-bd7f-c04b1697636e-kube-api-access-mhj66\") pod \"ingress-canary-z2jtl\" (UID: \"a3eda937-2ad6-4192-bd7f-c04b1697636e\") " pod="openshift-ingress-canary/ingress-canary-z2jtl"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.740840 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.741019 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.240990939 +0000 UTC m=+143.229002411 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.741303 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.741604 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.241598286 +0000 UTC m=+143.229609758 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:20 crc kubenswrapper[4788]: W0219 08:47:20.744353 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod164b83b5_7bd4_4bea_8f7e_76c83c46a4b0.slice/crio-8f69c8fc6c429fa029dd84fabc5691025f810dd56ec3cab544561e970e853753 WatchSource:0}: Error finding container 8f69c8fc6c429fa029dd84fabc5691025f810dd56ec3cab544561e970e853753: Status 404 returned error can't find the container with id 8f69c8fc6c429fa029dd84fabc5691025f810dd56ec3cab544561e970e853753
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.750111 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.761166 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.764758 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e285d56c-2894-48f6-987d-217b4efd8f6e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: \"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.779718 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.786917 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrz4d\" (UniqueName: \"kubernetes.io/projected/5a3854ab-c63a-4697-975c-e049cc23e0d3-kube-api-access-xrz4d\") pod \"olm-operator-6b444d44fb-sfqbb\" (UID: \"5a3854ab-c63a-4697-975c-e049cc23e0d3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.797655 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.803853 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlpzl\" (UniqueName: \"kubernetes.io/projected/aef03b84-8d54-4cff-a236-20478d45a45d-kube-api-access-rlpzl\") pod \"migrator-59844c95c7-nfw9m\" (UID: \"aef03b84-8d54-4cff-a236-20478d45a45d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.820575 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9c2k4"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.821027 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7lw4\" (UniqueName: \"kubernetes.io/projected/f4b40324-2628-4351-a48f-c607b5a19114-kube-api-access-w7lw4\") pod \"machine-config-controller-84d6567774-kq7pj\" (UID: \"f4b40324-2628-4351-a48f-c607b5a19114\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.835066 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z2jtl"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.841201 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlwk2\" (UniqueName: \"kubernetes.io/projected/e285d56c-2894-48f6-987d-217b4efd8f6e-kube-api-access-jlwk2\") pod \"ingress-operator-5b745b69d9-qmnq5\" (UID: \"e285d56c-2894-48f6-987d-217b4efd8f6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.843059 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.843122 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hbn8p"
Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.843789 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.343773421 +0000 UTC m=+143.331784893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.853078 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-n6shm"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.859679 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jh7r\" (UniqueName: \"kubernetes.io/projected/55518a10-e84e-48fa-bc48-fb357a94f6ea-kube-api-access-8jh7r\") pod \"service-ca-9c57cc56f-s6bkf\" (UID: \"55518a10-e84e-48fa-bc48-fb357a94f6ea\") " pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.876391 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjfgr\" (UniqueName: \"kubernetes.io/projected/6826dee1-4dec-4b7c-88a1-600eb014574c-kube-api-access-bjfgr\") pod \"collect-profiles-29524845-8nftq\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.899587 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fdbp\" (UniqueName: \"kubernetes.io/projected/85f68684-96fb-43fd-bdd0-384451fb1a58-kube-api-access-8fdbp\") pod \"catalog-operator-68c6474976-zq4ss\" (UID: \"85f68684-96fb-43fd-bdd0-384451fb1a58\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss"
Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.918697 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25pqh\" (UniqueName: \"kubernetes.io/projected/7ea42bdd-b321-4b32-91d3-0e1cb559ace7-kube-api-access-25pqh\") pod \"multus-admission-controller-857f4d67dd-n2kj2\" (UID: \"7ea42bdd-b321-4b32-91d3-0e1cb559ace7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2" Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.945398 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:20 crc kubenswrapper[4788]: E0219 08:47:20.945779 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.445764052 +0000 UTC m=+143.433775524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.951707 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7" Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.959518 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m" Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.977986 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6"] Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.986724 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd"] Feb 19 08:47:20 crc kubenswrapper[4788]: I0219 08:47:20.991440 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.002610 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2" Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.014906 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.032943 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jr6pt"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.043528 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.066913 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.068446 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf" Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.072903 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.073388 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.573371235 +0000 UTC m=+143.561382707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.083872 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5cmw7"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.088814 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss" Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.090075 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-9s2zg"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.091816 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.113604 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" Feb 19 08:47:21 crc kubenswrapper[4788]: W0219 08:47:21.137211 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7108fd8d_57c8_42b0_9fe2_08ca6b33b2de.slice/crio-791fada88e208cfadeb5033fc2ed56c28681c9f00b0f58d630deda63ba221aaf WatchSource:0}: Error finding container 791fada88e208cfadeb5033fc2ed56c28681c9f00b0f58d630deda63ba221aaf: Status 404 returned error can't find the container with id 791fada88e208cfadeb5033fc2ed56c28681c9f00b0f58d630deda63ba221aaf Feb 19 08:47:21 crc kubenswrapper[4788]: W0219 08:47:21.158670 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ecb616f_62fc_4ff2_a353_6e08c63581a8.slice/crio-79e8c2c248ad90bf6c442efaf01a8c12123762e862d04e246650f8a3f8eaf882 WatchSource:0}: Error finding container 79e8c2c248ad90bf6c442efaf01a8c12123762e862d04e246650f8a3f8eaf882: Status 404 returned error can't find the container with id 79e8c2c248ad90bf6c442efaf01a8c12123762e862d04e246650f8a3f8eaf882 Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.174555 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.174946 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.674929113 +0000 UTC m=+143.662940585 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.220034 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zhsg9"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.245799 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.245847 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qgw8f"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.275783 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.275925 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.775906524 +0000 UTC m=+143.763917986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.280178 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.780164366 +0000 UTC m=+143.768175838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.282855 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:21 crc kubenswrapper[4788]: W0219 08:47:21.291389 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e569289_11c5_4577_92e6_c8031eda90ec.slice/crio-ac9087118402efb060cf1b4de5940516cc309b75add59ef4f1883cd2f8ce7eab WatchSource:0}: Error finding container ac9087118402efb060cf1b4de5940516cc309b75add59ef4f1883cd2f8ce7eab: Status 404 returned error can't find the container with id ac9087118402efb060cf1b4de5940516cc309b75add59ef4f1883cd2f8ce7eab Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.328462 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-x5fpc"] Feb 19 08:47:21 crc kubenswrapper[4788]: W0219 08:47:21.329458 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod039e68bf_13c6_484b_a4db_229d6d6b5886.slice/crio-6ffcbc77f3ae6e8a875b23e14e48ddcffaa64be182dd6c0aac655cdeba72cb1e WatchSource:0}: Error finding container 
6ffcbc77f3ae6e8a875b23e14e48ddcffaa64be182dd6c0aac655cdeba72cb1e: Status 404 returned error can't find the container with id 6ffcbc77f3ae6e8a875b23e14e48ddcffaa64be182dd6c0aac655cdeba72cb1e Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.332395 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xw477"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.377430 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-brvcp"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.380325 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.382127 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jdjtx"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.383916 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.384234 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.884219375 +0000 UTC m=+143.872230847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:21 crc kubenswrapper[4788]: W0219 08:47:21.399978 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec69babc_944a_4707_914f_5f1da38d6316.slice/crio-e7863073ffe9e15cbb0804234f937e04176f35c0e559583e3ec67bcd98eb1527 WatchSource:0}: Error finding container e7863073ffe9e15cbb0804234f937e04176f35c0e559583e3ec67bcd98eb1527: Status 404 returned error can't find the container with id e7863073ffe9e15cbb0804234f937e04176f35c0e559583e3ec67bcd98eb1527 Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.460526 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-8vngm"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.470587 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.485487 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.485841 4788 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:21.985827394 +0000 UTC m=+143.973838866 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.489858 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-h7qcn"] Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.508629 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" event={"ID":"6b1add9a-33b1-4eec-b8fa-1b42f94b04d1","Type":"ContainerStarted","Data":"075d539127d81c14ad269e0e59ea42ba825d98f96bff4ebf5a86c5767803d02d"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.508674 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" event={"ID":"6b1add9a-33b1-4eec-b8fa-1b42f94b04d1","Type":"ContainerStarted","Data":"b7bd2de257cf5654efec034ec0ac6f7216d65badcad6a65eac08bff0df6a641a"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.513972 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd" event={"ID":"26d9a486-1abe-4d18-8b80-723c8d25ef89","Type":"ContainerStarted","Data":"89ce2f11cce9acfd9123434f78c2d2019e65c569e58dd8cb311445554ba2aa9e"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.515179 4788 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-brvcp" event={"ID":"b37fe460-3670-48fc-8eaf-d0bbf8b26557","Type":"ContainerStarted","Data":"1cef9e412c9ed9ceef7d1f2a84a510266df6c14869bf88af0a1d582e4dfb55da"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.536107 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jr6pt" event={"ID":"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de","Type":"ContainerStarted","Data":"791fada88e208cfadeb5033fc2ed56c28681c9f00b0f58d630deda63ba221aaf"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.579650 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t" event={"ID":"3ccbb120-c076-426f-b24b-fe0530b3e056","Type":"ContainerStarted","Data":"c64d31ab25799bfbe210c8da0be09570054364a6c5c27ccb313fc51813b8852e"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.586024 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.586424 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.086404264 +0000 UTC m=+144.074415736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.587411 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.587724 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.087716691 +0000 UTC m=+144.075728163 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.609214 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8tvwh" event={"ID":"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0","Type":"ContainerStarted","Data":"989c1b35078567480b206b9b471cd04f827545a4db2af179ee462831792d8214"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.609274 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-8tvwh" event={"ID":"164b83b5-7bd4-4bea-8f7e-76c83c46a4b0","Type":"ContainerStarted","Data":"8f69c8fc6c429fa029dd84fabc5691025f810dd56ec3cab544561e970e853753"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.623939 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" event={"ID":"3c317fc6-2b8f-40ea-97a8-03a1da391b8f","Type":"ContainerStarted","Data":"aa2fd260187c6ebcb721a03102fc27a3338380134c71e4af91b873d0859460e0"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.629922 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" event={"ID":"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7","Type":"ContainerStarted","Data":"b9b6ab176119d897c2f9a3e5060602f1c6497159c52d9db471ff155bfeaa6cfe"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.629975 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" 
event={"ID":"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7","Type":"ContainerStarted","Data":"0b0ee3426058dbe9a8dd9f849b4d56bc740a08513892307e285e1bf6d93e87da"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.630455 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.633685 4788 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-t7bpz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.633724 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" podUID="c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.636227 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" event={"ID":"104f8fd7-4fdc-4e3b-8028-30c2630091b6","Type":"ContainerStarted","Data":"2cfd1d62bf3170a1008b0686869358b2cebaf58c8ba5616c537f8dcfbe621971"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.639028 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" event={"ID":"2bebdf2c-5451-43b9-b9fd-dc182d5edde1","Type":"ContainerStarted","Data":"a0305a1c83788fc59a6765db1ff94301c7ce01aaa8652b1bd3b9a39054be9769"} Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.642235 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n6shm" 
event={"ID":"7e1aeeee-05bb-4755-bc17-7cb87025639a","Type":"ContainerStarted","Data":"5ad819e68e60833ee430bf35bb13af7351e27039ef15eb29b51e3fc6af98ce28"}
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.642286 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-n6shm" event={"ID":"7e1aeeee-05bb-4755-bc17-7cb87025639a","Type":"ContainerStarted","Data":"a82825ce609849327ddb84f40878a824ed8dca802becd25d6e8f76139ab716f5"}
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.647548 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" event={"ID":"d49ed318-47b5-4101-b4b9-09dda3667dd3","Type":"ContainerStarted","Data":"d537288b823f6e0cada6c9d1910022867513effaf35af22c02c88cceea92e08e"}
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.653205 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" event={"ID":"039e68bf-13c6-484b-a4db-229d6d6b5886","Type":"ContainerStarted","Data":"6ffcbc77f3ae6e8a875b23e14e48ddcffaa64be182dd6c0aac655cdeba72cb1e"}
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.658638 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg" event={"ID":"2ecb616f-62fc-4ff2-a353-6e08c63581a8","Type":"ContainerStarted","Data":"79e8c2c248ad90bf6c442efaf01a8c12123762e862d04e246650f8a3f8eaf882"}
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.660932 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9" event={"ID":"6e569289-11c5-4577-92e6-c8031eda90ec","Type":"ContainerStarted","Data":"ac9087118402efb060cf1b4de5940516cc309b75add59ef4f1883cd2f8ce7eab"}
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.664055 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" event={"ID":"ec69babc-944a-4707-914f-5f1da38d6316","Type":"ContainerStarted","Data":"e7863073ffe9e15cbb0804234f937e04176f35c0e559583e3ec67bcd98eb1527"}
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.673023 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" event={"ID":"31c71a84-dec5-44b7-b970-0e7a7cb39a5e","Type":"ContainerStarted","Data":"79022d4336c2e424dc24b8af8dc2e2375224600583fd70a143584c6baac59723"}
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.680149 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-8tvwh"
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.688626 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.690115 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.190095402 +0000 UTC m=+144.178106864 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.690452 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 08:47:21 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld
Feb 19 08:47:21 crc kubenswrapper[4788]: [+]process-running ok
Feb 19 08:47:21 crc kubenswrapper[4788]: healthz check failed
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.690490 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.733470 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.738533 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.739404 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.748719 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hbn8p"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.774043 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.782324 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.787336 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.791734 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.792032 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.29202085 +0000 UTC m=+144.280032322 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.835804 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9c2k4"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.854955 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z2jtl"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.882413 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.889027 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.895049 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.896076 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.396058719 +0000 UTC m=+144.384070191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.896646 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.896957 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.396949275 +0000 UTC m=+144.384960747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.900185 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-s6bkf"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.904536 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5"]
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.907008 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj"]
Feb 19 08:47:21 crc kubenswrapper[4788]: W0219 08:47:21.920162 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37572f7a_2fcf_4d28_993d_cd924c0a78b8.slice/crio-9fe4dff49bc213d23c76d074a1ef941a7bd1095b27f868cf28dbc7fbe482c0b9 WatchSource:0}: Error finding container 9fe4dff49bc213d23c76d074a1ef941a7bd1095b27f868cf28dbc7fbe482c0b9: Status 404 returned error can't find the container with id 9fe4dff49bc213d23c76d074a1ef941a7bd1095b27f868cf28dbc7fbe482c0b9
Feb 19 08:47:21 crc kubenswrapper[4788]: W0219 08:47:21.991155 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b40324_2628_4351_a48f_c607b5a19114.slice/crio-535dabac0541824b6eb4be131242e06dbeae5626dfd7d8fead4c6254dcb0f11b WatchSource:0}: Error finding container 535dabac0541824b6eb4be131242e06dbeae5626dfd7d8fead4c6254dcb0f11b: Status 404 returned error can't find the container with id 535dabac0541824b6eb4be131242e06dbeae5626dfd7d8fead4c6254dcb0f11b
Feb 19 08:47:21 crc kubenswrapper[4788]: W0219 08:47:21.996216 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55518a10_e84e_48fa_bc48_fb357a94f6ea.slice/crio-773bdf475115a7b3347271981919d65b39030fb8d42c51bd92a06526130c1976 WatchSource:0}: Error finding container 773bdf475115a7b3347271981919d65b39030fb8d42c51bd92a06526130c1976: Status 404 returned error can't find the container with id 773bdf475115a7b3347271981919d65b39030fb8d42c51bd92a06526130c1976
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.997718 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.998004 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.497985348 +0000 UTC m=+144.485996830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:21 crc kubenswrapper[4788]: I0219 08:47:21.998042 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:21 crc kubenswrapper[4788]: E0219 08:47:21.998496 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.498487392 +0000 UTC m=+144.486498864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.106028 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:22 crc kubenswrapper[4788]: E0219 08:47:22.106435 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.606415502 +0000 UTC m=+144.594426974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.128121 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq"]
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.150167 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.150234 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.152395 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb"]
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.160174 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-n2kj2"]
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.172163 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss"]
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.207705 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:22 crc kubenswrapper[4788]: E0219 08:47:22.208090 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.708078403 +0000 UTC m=+144.696089875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:22 crc kubenswrapper[4788]: W0219 08:47:22.283507 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ea42bdd_b321_4b32_91d3_0e1cb559ace7.slice/crio-62b62d3a4378f6fdc0726e4a1cd02d4027b1e0cbd5d96cd8030e9e3a376cac39 WatchSource:0}: Error finding container 62b62d3a4378f6fdc0726e4a1cd02d4027b1e0cbd5d96cd8030e9e3a376cac39: Status 404 returned error can't find the container with id 62b62d3a4378f6fdc0726e4a1cd02d4027b1e0cbd5d96cd8030e9e3a376cac39
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.311934 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:22 crc kubenswrapper[4788]: E0219 08:47:22.312490 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.812472012 +0000 UTC m=+144.800483484 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.413916 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:22 crc kubenswrapper[4788]: E0219 08:47:22.414349 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:22.914320148 +0000 UTC m=+144.902331620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.515026 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:22 crc kubenswrapper[4788]: E0219 08:47:22.515344 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:23.01533071 +0000 UTC m=+145.003342182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.618839 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:22 crc kubenswrapper[4788]: E0219 08:47:22.619748 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:23.119732499 +0000 UTC m=+145.107743971 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.692262 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 08:47:22 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld
Feb 19 08:47:22 crc kubenswrapper[4788]: [+]process-running ok
Feb 19 08:47:22 crc kubenswrapper[4788]: healthz check failed
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.692321 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.720152 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:22 crc kubenswrapper[4788]: E0219 08:47:22.720487 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:23.220469832 +0000 UTC m=+145.208481304 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.731323 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-8tvwh" podStartSLOduration=122.731295332 podStartE2EDuration="2m2.731295332s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:22.624578837 +0000 UTC m=+144.612590309" watchObservedRunningTime="2026-02-19 08:47:22.731295332 +0000 UTC m=+144.719306804"
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.816201 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" event={"ID":"e285d56c-2894-48f6-987d-217b4efd8f6e","Type":"ContainerStarted","Data":"f98a3f0b584239a97a286f615f4b6b233e92568c2244e5ef2ac2321050761ec3"}
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.824880 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-286ds" podStartSLOduration=122.824857551 podStartE2EDuration="2m2.824857551s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:22.734997598 +0000 UTC m=+144.723009070" watchObservedRunningTime="2026-02-19 08:47:22.824857551 +0000 UTC m=+144.812869023"
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.825137 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" podStartSLOduration=122.825132858 podStartE2EDuration="2m2.825132858s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:22.823913053 +0000 UTC m=+144.811924525" watchObservedRunningTime="2026-02-19 08:47:22.825132858 +0000 UTC m=+144.813144330"
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.825175 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:22 crc kubenswrapper[4788]: E0219 08:47:22.825595 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:23.325577661 +0000 UTC m=+145.313589133 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.847308 4788 generic.go:334] "Generic (PLEG): container finished" podID="039e68bf-13c6-484b-a4db-229d6d6b5886" containerID="4c8bdbd2c0334a31b69baf3c8cb9fb8b154d7284ccad6203baa03a3dcad41fce" exitCode=0
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.847759 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" event={"ID":"039e68bf-13c6-484b-a4db-229d6d6b5886","Type":"ContainerDied","Data":"4c8bdbd2c0334a31b69baf3c8cb9fb8b154d7284ccad6203baa03a3dcad41fce"}
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.861485 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-n6shm" podStartSLOduration=4.861466349 podStartE2EDuration="4.861466349s" podCreationTimestamp="2026-02-19 08:47:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:22.860503001 +0000 UTC m=+144.848514473" watchObservedRunningTime="2026-02-19 08:47:22.861466349 +0000 UTC m=+144.849477821"
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.868906 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9" event={"ID":"6e569289-11c5-4577-92e6-c8031eda90ec","Type":"ContainerStarted","Data":"5bc0eabbdbef97f6cc7bc2e100bef2c5210bdfab7c446b161400586bdcb632db"}
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.879711 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7" event={"ID":"37572f7a-2fcf-4d28-993d-cd924c0a78b8","Type":"ContainerStarted","Data":"1d87c7db4050d61f15514a2e2e7b83080413d08c02c695ccdeecd718cf6a82ac"}
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.883492 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7" event={"ID":"37572f7a-2fcf-4d28-993d-cd924c0a78b8","Type":"ContainerStarted","Data":"9fe4dff49bc213d23c76d074a1ef941a7bd1095b27f868cf28dbc7fbe482c0b9"}
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.926177 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:22 crc kubenswrapper[4788]: E0219 08:47:22.936431 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:23.436407754 +0000 UTC m=+145.424419226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.948441 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng2p7" podStartSLOduration=122.948415498 podStartE2EDuration="2m2.948415498s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:22.946195804 +0000 UTC m=+144.934207276" watchObservedRunningTime="2026-02-19 08:47:22.948415498 +0000 UTC m=+144.936426970"
Feb 19 08:47:22 crc kubenswrapper[4788]: I0219 08:47:22.968794 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss" event={"ID":"85f68684-96fb-43fd-bdd0-384451fb1a58","Type":"ContainerStarted","Data":"ca11d287a228e1e9e878979a63e22fed526e78fe6a668f93166f613f3960c4dc"}
Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.027685 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:23 crc kubenswrapper[4788]: E0219 08:47:23.029819 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:23.529806588 +0000 UTC m=+145.517818060 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.042278 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" event={"ID":"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6","Type":"ContainerStarted","Data":"9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737"}
Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.042314 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" event={"ID":"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6","Type":"ContainerStarted","Data":"5bbd68de5318f9b02c4fcc865f459413989088ff7da20d8d8801de3659d85113"}
Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.042621 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx"
Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.049891 4788 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jdjtx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.049924 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" podUID="90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.076676 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" podStartSLOduration=123.07666111 podStartE2EDuration="2m3.07666111s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.07527544 +0000 UTC m=+145.063286922" watchObservedRunningTime="2026-02-19 08:47:23.07666111 +0000 UTC m=+145.064672582"
Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.113801 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-h7qcn" event={"ID":"b590f5cb-d3a6-43b9-97ef-29ad515ecbc9","Type":"ContainerStarted","Data":"70f2dca0287a4b559c387752caf1e1750533fd1763c212b6447479e2299c3779"}
Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.113871 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-h7qcn" event={"ID":"b590f5cb-d3a6-43b9-97ef-29ad515ecbc9","Type":"ContainerStarted","Data":"27e6374381498743168296870e5bab394609e5c385a88a8a82d088ce83960c07"}
Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.114039 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-h7qcn"
Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.125778 4788 patch_prober.go:28] interesting pod/downloads-7954f5f757-h7qcn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\":
dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.125844 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h7qcn" podUID="b590f5cb-d3a6-43b9-97ef-29ad515ecbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.128629 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:23 crc kubenswrapper[4788]: E0219 08:47:23.129314 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:23.629298167 +0000 UTC m=+145.617309639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.137164 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-brvcp" event={"ID":"b37fe460-3670-48fc-8eaf-d0bbf8b26557","Type":"ContainerStarted","Data":"b0b5c751c7e41a84dd221138e53bf5907ba602285427ff3986a259eea540bf58"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.138403 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-brvcp" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.142136 4788 patch_prober.go:28] interesting pod/console-operator-58897d9998-brvcp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.142216 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-brvcp" podUID="b37fe460-3670-48fc-8eaf-d0bbf8b26557" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.205880 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jr6pt" 
event={"ID":"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de","Type":"ContainerStarted","Data":"0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.210032 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-h7qcn" podStartSLOduration=123.210005347 podStartE2EDuration="2m3.210005347s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.158526774 +0000 UTC m=+145.146538246" watchObservedRunningTime="2026-02-19 08:47:23.210005347 +0000 UTC m=+145.198016819" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.210846 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-brvcp" podStartSLOduration=123.210839001 podStartE2EDuration="2m3.210839001s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.198644022 +0000 UTC m=+145.186655494" watchObservedRunningTime="2026-02-19 08:47:23.210839001 +0000 UTC m=+145.198850473" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.225053 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb" event={"ID":"2b012062-484f-4ad8-99c5-37425cb6e3e1","Type":"ContainerStarted","Data":"cb4aad239f0d84c4f6608ec49a71f61bd4cf58c2a93d970c921f548ecd04c2e0"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.231517 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:23 crc kubenswrapper[4788]: E0219 08:47:23.240020 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:23.739998506 +0000 UTC m=+145.728009978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.250793 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" event={"ID":"3b09b21d-9a25-46cf-92c9-a1e427c068c6","Type":"ContainerStarted","Data":"66aea32ebe2345f8213cfcf7c6302416dd6d3c568bab9d0febca46a54ec1e462"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.251344 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.255056 4788 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ltdd8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.255299 4788 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" podUID="3b09b21d-9a25-46cf-92c9-a1e427c068c6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.258947 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-jr6pt" podStartSLOduration=123.258921338 podStartE2EDuration="2m3.258921338s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.250558479 +0000 UTC m=+145.238569951" watchObservedRunningTime="2026-02-19 08:47:23.258921338 +0000 UTC m=+145.246932810" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.313123 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" event={"ID":"ec69babc-944a-4707-914f-5f1da38d6316","Type":"ContainerStarted","Data":"6c4979e728fffe4a68cb6f93d718752956537d933cd7d6d40d17f0e933deaa5b"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.328777 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf" event={"ID":"55518a10-e84e-48fa-bc48-fb357a94f6ea","Type":"ContainerStarted","Data":"773bdf475115a7b3347271981919d65b39030fb8d42c51bd92a06526130c1976"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.358396 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" podStartSLOduration=123.358365955 podStartE2EDuration="2m3.358365955s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
08:47:23.285159519 +0000 UTC m=+145.273170991" watchObservedRunningTime="2026-02-19 08:47:23.358365955 +0000 UTC m=+145.346377427" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.365037 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.374165 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.375190 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" event={"ID":"f4b40324-2628-4351-a48f-c607b5a19114","Type":"ContainerStarted","Data":"535dabac0541824b6eb4be131242e06dbeae5626dfd7d8fead4c6254dcb0f11b"} Feb 19 08:47:23 crc kubenswrapper[4788]: E0219 08:47:23.382454 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:23.882422224 +0000 UTC m=+145.870433696 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.382520 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:23 crc kubenswrapper[4788]: E0219 08:47:23.383710 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:23.88370214 +0000 UTC m=+145.871713612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.413275 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd" event={"ID":"8422f1bf-0f17-43df-8425-93d34601dedd","Type":"ContainerStarted","Data":"771021fc79c7e2ba9a0c30e15d1d9333a822b12a35738480581d611aef5e5f7f"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.413314 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd" event={"ID":"8422f1bf-0f17-43df-8425-93d34601dedd","Type":"ContainerStarted","Data":"e8891937d1dc696f365e4f959b4f6e8a612f3e260c4916368bfbc7453d7750f1"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.416003 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" podStartSLOduration=123.415992115 podStartE2EDuration="2m3.415992115s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.414844502 +0000 UTC m=+145.402855974" watchObservedRunningTime="2026-02-19 08:47:23.415992115 +0000 UTC m=+145.404003587" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.431196 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" 
event={"ID":"9f6fb576-402e-4972-aa09-089865ce389b","Type":"ContainerStarted","Data":"f9bc924a174d65e724226fa643120c9b76784db557222412ba4db0460001ee5b"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.431269 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" event={"ID":"9f6fb576-402e-4972-aa09-089865ce389b","Type":"ContainerStarted","Data":"88a3598bf090f85e4f96b950efb266f9271c3b9648383e700780ac8f2cc48ac6"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.462475 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf" podStartSLOduration=123.462455575 podStartE2EDuration="2m3.462455575s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.44340877 +0000 UTC m=+145.431420262" watchObservedRunningTime="2026-02-19 08:47:23.462455575 +0000 UTC m=+145.450467047" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.471685 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hbn8p" event={"ID":"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b","Type":"ContainerStarted","Data":"87629a4314be3030b9d1163d2e9f1b3932604f7589ad7f813cb8674d59b8891f"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.483764 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:23 crc kubenswrapper[4788]: E0219 08:47:23.484071 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:23.984054754 +0000 UTC m=+145.972066226 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.502738 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" event={"ID":"2bebdf2c-5451-43b9-b9fd-dc182d5edde1","Type":"ContainerStarted","Data":"75cb96ff1085a542ea1f462907fe2fb18ceec121275279c7a10332e24999ca8a"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.513043 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bmknd" podStartSLOduration=123.513027033 podStartE2EDuration="2m3.513027033s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.466495521 +0000 UTC m=+145.454506993" watchObservedRunningTime="2026-02-19 08:47:23.513027033 +0000 UTC m=+145.501038505" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.538552 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" event={"ID":"104f8fd7-4fdc-4e3b-8028-30c2630091b6","Type":"ContainerStarted","Data":"4f689cc18d2f512aee8a31b7700c763e8ddf0efab6ce75aa42f11557e5183a15"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.538740 4788 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqbfz" podStartSLOduration=125.538727639 podStartE2EDuration="2m5.538727639s" podCreationTimestamp="2026-02-19 08:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.536543356 +0000 UTC m=+145.524554828" watchObservedRunningTime="2026-02-19 08:47:23.538727639 +0000 UTC m=+145.526739111" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.568494 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" event={"ID":"6826dee1-4dec-4b7c-88a1-600eb014574c","Type":"ContainerStarted","Data":"55aadb5f6cdf03ef728a8d94f8939966b13ea1ce88dc11ff0478e4bbe744bb40"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.587318 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:23 crc kubenswrapper[4788]: E0219 08:47:23.587761 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:24.087751142 +0000 UTC m=+146.075762614 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.601206 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" event={"ID":"5a3854ab-c63a-4697-975c-e049cc23e0d3","Type":"ContainerStarted","Data":"540969311be663fe4f4aa660c6a4dbbbdcfdf27afb6879c9e7eebb7510fefb84"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.602351 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.616482 4788 csr.go:261] certificate signing request csr-q2b5w is approved, waiting to be issued Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.618935 4788 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-sfqbb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.619390 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" podUID="5a3854ab-c63a-4697-975c-e049cc23e0d3" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.624501 4788 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" event={"ID":"6e744116-7d2b-4c8d-847b-8b0dc683e2d8","Type":"ContainerStarted","Data":"5d878b8143ccf9455c77d4ff94f307e646812fd849dac7fb5dab69cf3e08f84b"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.624622 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" event={"ID":"6e744116-7d2b-4c8d-847b-8b0dc683e2d8","Type":"ContainerStarted","Data":"11e40049e33ac0b435645139ddd186080061509cc2f4abfaab51db7a9e868ca1"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.626370 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" podStartSLOduration=124.626357568 podStartE2EDuration="2m4.626357568s" podCreationTimestamp="2026-02-19 08:45:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.62573929 +0000 UTC m=+145.613750762" watchObservedRunningTime="2026-02-19 08:47:23.626357568 +0000 UTC m=+145.614369040" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.627208 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-5cmw7" podStartSLOduration=123.627202612 podStartE2EDuration="2m3.627202612s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.568359637 +0000 UTC m=+145.556371109" watchObservedRunningTime="2026-02-19 08:47:23.627202612 +0000 UTC m=+145.615214084" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.629678 4788 csr.go:257] certificate signing request csr-q2b5w is issued Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.646479 
4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z2jtl" event={"ID":"a3eda937-2ad6-4192-bd7f-c04b1697636e","Type":"ContainerStarted","Data":"8e39c0863e17fde40903b312486f9d247b94df02bad363a88d8c46caf1770f6c"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.660656 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-x5fpc" podStartSLOduration=123.660640529 podStartE2EDuration="2m3.660640529s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.650644583 +0000 UTC m=+145.638656065" watchObservedRunningTime="2026-02-19 08:47:23.660640529 +0000 UTC m=+145.648652001" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.667372 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" event={"ID":"9192ab47-cd4b-4d49-916f-e1454d517b52","Type":"ContainerStarted","Data":"5e63f672769edb5b5c523c9e3b9ff46d56906faf415fa7bc59c2b15a367b9bab"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.690133 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:23 crc kubenswrapper[4788]: E0219 08:47:23.690313 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 08:47:24.190298758 +0000 UTC m=+146.178310230 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.690460 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:23 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:23 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:23 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.690521 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.690576 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:23 crc kubenswrapper[4788]: E0219 08:47:23.691624 4788 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:24.191612346 +0000 UTC m=+146.179623818 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.718075 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf" event={"ID":"192b6105-6538-462c-8b1c-3a1a69cea50d","Type":"ContainerStarted","Data":"d59d674a8f76e3deec3dcfa699c494239c7e5a005977c4d247a89beed475872e"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.723977 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd" event={"ID":"26d9a486-1abe-4d18-8b80-723c8d25ef89","Type":"ContainerStarted","Data":"73431a6eb667e3b96e082148b066a97183814b6a93e360310997f82303594051"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.724007 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd" event={"ID":"26d9a486-1abe-4d18-8b80-723c8d25ef89","Type":"ContainerStarted","Data":"b8bae6a97e15a9fdd5faf2dc2b9b436e23b2156c3735423bc8797d86622b790d"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.734142 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" 
event={"ID":"d49ed318-47b5-4101-b4b9-09dda3667dd3","Type":"ContainerStarted","Data":"3a5deed482a9014121da04b5bd8f2543b0f5cc6ff8d2f80947599b0c4b9aa3fc"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.782546 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" podStartSLOduration=123.782525819 podStartE2EDuration="2m3.782525819s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.707378397 +0000 UTC m=+145.695389869" watchObservedRunningTime="2026-02-19 08:47:23.782525819 +0000 UTC m=+145.770537291" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.782840 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z2jtl" podStartSLOduration=5.782834938 podStartE2EDuration="5.782834938s" podCreationTimestamp="2026-02-19 08:47:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.780685516 +0000 UTC m=+145.768696988" watchObservedRunningTime="2026-02-19 08:47:23.782834938 +0000 UTC m=+145.770846420" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.822536 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" podStartSLOduration=123.822516694 podStartE2EDuration="2m3.822516694s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.811993463 +0000 UTC m=+145.800004935" watchObservedRunningTime="2026-02-19 08:47:23.822516694 +0000 UTC m=+145.810528166" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.828800 
4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:23 crc kubenswrapper[4788]: E0219 08:47:23.836609 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:24.336589887 +0000 UTC m=+146.324601359 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.842265 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg" event={"ID":"2ecb616f-62fc-4ff2-a353-6e08c63581a8","Type":"ContainerStarted","Data":"fe5a76b17a19b5965f2bd790279c5a1bc553daf68fa9a14aeb918cda9353244f"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.842458 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pcjq6" podStartSLOduration=123.842439914 podStartE2EDuration="2m3.842439914s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 
08:47:23.841627631 +0000 UTC m=+145.829639103" watchObservedRunningTime="2026-02-19 08:47:23.842439914 +0000 UTC m=+145.830451386" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.866658 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" event={"ID":"31c71a84-dec5-44b7-b970-0e7a7cb39a5e","Type":"ContainerStarted","Data":"8bd5611642d65a6af25e32120c8d54784edf2951851d9c0aeeda6a7bb2c3dfa8"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.875649 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.935914 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:23 crc kubenswrapper[4788]: E0219 08:47:23.944617 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:24.444594889 +0000 UTC m=+146.432606361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.944879 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m" event={"ID":"aef03b84-8d54-4cff-a236-20478d45a45d","Type":"ContainerStarted","Data":"dfa9c113e2351791e40c8b2f39f929dc5cc07d0d6d7264d1e3a0948c2cd08f37"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.980985 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt" event={"ID":"0ac75663-d01c-4122-b30a-faf65d9a063a","Type":"ContainerStarted","Data":"1c0925c9a69c5c4d350eda97e8d198be6c284282c21d53874b0ab44aad89b176"} Feb 19 08:47:23 crc kubenswrapper[4788]: I0219 08:47:23.981033 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt" event={"ID":"0ac75663-d01c-4122-b30a-faf65d9a063a","Type":"ContainerStarted","Data":"3cf3237d68302a5a5a174586e49ef8430c26b99f3432825a01e5caf842abef42"} Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.001485 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2" event={"ID":"7ea42bdd-b321-4b32-91d3-0e1cb559ace7","Type":"ContainerStarted","Data":"62b62d3a4378f6fdc0726e4a1cd02d4027b1e0cbd5d96cd8030e9e3a376cac39"} Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.036919 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jmrsd" podStartSLOduration=124.036894702 podStartE2EDuration="2m4.036894702s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:23.890088268 +0000 UTC m=+145.878099750" watchObservedRunningTime="2026-02-19 08:47:24.036894702 +0000 UTC m=+146.024906174" Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.037629 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg" podStartSLOduration=124.037622612 podStartE2EDuration="2m4.037622612s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:24.023956561 +0000 UTC m=+146.011968033" watchObservedRunningTime="2026-02-19 08:47:24.037622612 +0000 UTC m=+146.025634074" Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.038513 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t" event={"ID":"3ccbb120-c076-426f-b24b-fe0530b3e056","Type":"ContainerStarted","Data":"8bc5bfcb73cdc78e4d4bf20c7cf386efd02c18248fa6bbf8999a22b582651b90"} Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.043721 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:24 crc kubenswrapper[4788]: E0219 08:47:24.044447 4788 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:24.544414987 +0000 UTC m=+146.532426469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.062258 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" event={"ID":"ec358d29-e3c8-4f69-a1bc-7879193b026a","Type":"ContainerStarted","Data":"3ea3eb2582f367c0b65f5e4a29e120dd85431cd43b109dbde5a2a8597437994c"} Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.078518 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xw477" event={"ID":"49e1dd56-37f0-41b8-8afa-d040c5750fac","Type":"ContainerStarted","Data":"156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad"} Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.078564 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xw477" event={"ID":"49e1dd56-37f0-41b8-8afa-d040c5750fac","Type":"ContainerStarted","Data":"578489b10acbd1a479d23c61f019c6f66049ebd3da497f7b5b2147eb21da71af"} Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.079434 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.126040 4788 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" event={"ID":"0c05cbec-c6b7-439f-90b2-589e9068b6eb","Type":"ContainerStarted","Data":"58e56a1c012d7e1ca6918969b7aff128598c48421d5e4744e45c9537e66f653a"} Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.126658 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.149216 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:24 crc kubenswrapper[4788]: E0219 08:47:24.149544 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:24.649532487 +0000 UTC m=+146.637543959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.155638 4788 generic.go:334] "Generic (PLEG): container finished" podID="3c317fc6-2b8f-40ea-97a8-03a1da391b8f" containerID="eb061e392e4ea8f34334825fed31c8c7ecb44ba80f3e927377faafd3dbcb766b" exitCode=0 Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.157264 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" event={"ID":"3c317fc6-2b8f-40ea-97a8-03a1da391b8f","Type":"ContainerDied","Data":"eb061e392e4ea8f34334825fed31c8c7ecb44ba80f3e927377faafd3dbcb766b"} Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.195377 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.231701 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p7kds" podStartSLOduration=124.231681999 podStartE2EDuration="2m4.231681999s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:24.1726973 +0000 UTC m=+146.160708802" watchObservedRunningTime="2026-02-19 08:47:24.231681999 +0000 UTC m=+146.219693471" Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.250173 4788 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:24 crc kubenswrapper[4788]: E0219 08:47:24.251174 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:24.751159516 +0000 UTC m=+146.739170988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.313322 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xw477" podStartSLOduration=124.313303815 podStartE2EDuration="2m4.313303815s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:24.310953628 +0000 UTC m=+146.298965090" watchObservedRunningTime="2026-02-19 08:47:24.313303815 +0000 UTC m=+146.301315287" Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.353501 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:24 crc kubenswrapper[4788]: E0219 08:47:24.354572 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:24.854555547 +0000 UTC m=+146.842567019 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.355509 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" podStartSLOduration=124.355494043 podStartE2EDuration="2m4.355494043s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:24.353161897 +0000 UTC m=+146.341173369" watchObservedRunningTime="2026-02-19 08:47:24.355494043 +0000 UTC m=+146.343505515" Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.449619 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bdz4t" podStartSLOduration=124.449603458 podStartE2EDuration="2m4.449603458s" 
podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:24.448744763 +0000 UTC m=+146.436756235" watchObservedRunningTime="2026-02-19 08:47:24.449603458 +0000 UTC m=+146.437614930" Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.451489 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sm7pt" podStartSLOduration=124.451470911 podStartE2EDuration="2m4.451470911s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:24.379777269 +0000 UTC m=+146.367788741" watchObservedRunningTime="2026-02-19 08:47:24.451470911 +0000 UTC m=+146.439482383" Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.455900 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:24 crc kubenswrapper[4788]: E0219 08:47:24.456200 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:24.956187486 +0000 UTC m=+146.944198958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.556925 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:24 crc kubenswrapper[4788]: E0219 08:47:24.557365 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:25.057353893 +0000 UTC m=+147.045365365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.631386 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-19 08:42:23 +0000 UTC, rotation deadline is 2026-11-05 11:49:50.454812053 +0000 UTC Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.631437 4788 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6219h2m25.823378929s for next certificate rotation Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.662815 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:24 crc kubenswrapper[4788]: E0219 08:47:24.663052 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:25.163003438 +0000 UTC m=+147.151014910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.663112 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:24 crc kubenswrapper[4788]: E0219 08:47:24.663617 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:25.163608975 +0000 UTC m=+147.151620447 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.689480 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:24 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:24 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:24 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.689865 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.765405 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:24 crc kubenswrapper[4788]: E0219 08:47:24.765970 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-19 08:47:25.265948705 +0000 UTC m=+147.253960177 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.867361 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:24 crc kubenswrapper[4788]: E0219 08:47:24.868155 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:25.368143141 +0000 UTC m=+147.356154613 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:24 crc kubenswrapper[4788]: I0219 08:47:24.968772 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:24 crc kubenswrapper[4788]: E0219 08:47:24.969061 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:25.46904565 +0000 UTC m=+147.457057122 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.070899 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.071305 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:25.571287377 +0000 UTC m=+147.559298849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.079801 4788 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-xw477 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.16:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.079851 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-xw477" podUID="49e1dd56-37f0-41b8-8afa-d040c5750fac" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.16:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.172912 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.173474 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:25.673457832 +0000 UTC m=+147.661469304 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.178667 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" event={"ID":"9192ab47-cd4b-4d49-916f-e1454d517b52","Type":"ContainerStarted","Data":"624fb985a77f0227e541f20be73185c62d3042c1eb0ee926df478bcbbd0e7f6f"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.178881 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-fxs8h" event={"ID":"9192ab47-cd4b-4d49-916f-e1454d517b52","Type":"ContainerStarted","Data":"841366acf7c6994a6b447ee42a3372c52c9d6ace43d3a3fd34cc1f951154239b"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.185029 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" event={"ID":"3b09b21d-9a25-46cf-92c9-a1e427c068c6","Type":"ContainerStarted","Data":"05693eb992983e1378c0a7fec5cf1a85168809f4ec042464203f6c1505cc5cc0"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.188109 4788 generic.go:334] "Generic (PLEG): container finished" podID="9f6fb576-402e-4972-aa09-089865ce389b" containerID="f9bc924a174d65e724226fa643120c9b76784db557222412ba4db0460001ee5b" exitCode=0
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.188158 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" event={"ID":"9f6fb576-402e-4972-aa09-089865ce389b","Type":"ContainerDied","Data":"f9bc924a174d65e724226fa643120c9b76784db557222412ba4db0460001ee5b"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.188176 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" event={"ID":"9f6fb576-402e-4972-aa09-089865ce389b","Type":"ContainerStarted","Data":"782da174e4c0e3275ba513309ad538d6f1d636ac2e414cda37d64943cd239c36"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.188851 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.199660 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" event={"ID":"039e68bf-13c6-484b-a4db-229d6d6b5886","Type":"ContainerStarted","Data":"cb1d6b4d192b3f01f8fbe4d543698d05ca41c6b40f1a434b72cd683e6626e5d0"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.199704 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" event={"ID":"039e68bf-13c6-484b-a4db-229d6d6b5886","Type":"ContainerStarted","Data":"58e2190d04005bdd08f42fb6861309b8aa8edd7ee6d07393f9cb0dba43052c5d"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.202774 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb" event={"ID":"2b012062-484f-4ad8-99c5-37425cb6e3e1","Type":"ContainerStarted","Data":"517080b0a4467b19e148b0a16ef1c2bc474f7f852837fc4dbbecf149ba567914"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.208566 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf" event={"ID":"192b6105-6538-462c-8b1c-3a1a69cea50d","Type":"ContainerStarted","Data":"4ee823c0035bec77499cecab62bba4c133cef53254cc0f79c11bce895eb18c30"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.212037 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" event={"ID":"ec358d29-e3c8-4f69-a1bc-7879193b026a","Type":"ContainerStarted","Data":"b2ae4c0bc2302c6c2863384d7575e4911255dbc3ddbb12d60aa5bdf8b7f4ae3e"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.218015 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss" event={"ID":"85f68684-96fb-43fd-bdd0-384451fb1a58","Type":"ContainerStarted","Data":"5d96e5f18e4a6e7794cba001205f991c14d412d73ed1637732dcc0327df3607d"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.218360 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.221313 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-s6bkf" event={"ID":"55518a10-e84e-48fa-bc48-fb357a94f6ea","Type":"ContainerStarted","Data":"83e6158039224dcfcb4ec1e1454cb4dd4f291a680cac1bee940ce03c097a91d5"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.245356 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" event={"ID":"0c05cbec-c6b7-439f-90b2-589e9068b6eb","Type":"ContainerStarted","Data":"41be647505597071aa603860ea3bd73f8daba7e3225576f020931413eb4e72a7"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.245400 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" event={"ID":"0c05cbec-c6b7-439f-90b2-589e9068b6eb","Type":"ContainerStarted","Data":"b036effde94b60475407c6bb967b0c698dbc8de176ffb5dfc19f67e0283f103e"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.252298 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.262664 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" event={"ID":"6826dee1-4dec-4b7c-88a1-600eb014574c","Type":"ContainerStarted","Data":"358449ecb2f9cdb5274a35d672b21a1e0a91b14ff53f4bd28527d24009a3d03e"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.275352 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" podStartSLOduration=125.275333659 podStartE2EDuration="2m5.275333659s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:25.234952883 +0000 UTC m=+147.222964355" watchObservedRunningTime="2026-02-19 08:47:25.275333659 +0000 UTC m=+147.263345131"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.276322 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-zq4ss" podStartSLOduration=125.276316947 podStartE2EDuration="2m5.276316947s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:25.276090991 +0000 UTC m=+147.264102463" watchObservedRunningTime="2026-02-19 08:47:25.276316947 +0000 UTC m=+147.264328419"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.276961 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.283563 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:25.783550924 +0000 UTC m=+147.771562396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.284565 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" event={"ID":"3c317fc6-2b8f-40ea-97a8-03a1da391b8f","Type":"ContainerStarted","Data":"230c01d5f94f6b691e76237cf6d6a74cadbd5a046ddb215ea5b563d5c02a6b26"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.304415 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9" event={"ID":"6e569289-11c5-4577-92e6-c8031eda90ec","Type":"ContainerStarted","Data":"e370d2949ac2b8ae71a422b8c167206da53ca989fb31d3facc7d8ce1acfca9cc"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.314007 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-q56wb" podStartSLOduration=125.313993606 podStartE2EDuration="2m5.313993606s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:25.313129661 +0000 UTC m=+147.301141133" watchObservedRunningTime="2026-02-19 08:47:25.313993606 +0000 UTC m=+147.302005078"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.317400 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m" event={"ID":"aef03b84-8d54-4cff-a236-20478d45a45d","Type":"ContainerStarted","Data":"08352c69b706da52d3ff30a208446bed7b17b773fe30560e4a549cf423005f2e"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.317444 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m" event={"ID":"aef03b84-8d54-4cff-a236-20478d45a45d","Type":"ContainerStarted","Data":"5ed6acd4d7fed0df2ebada74ef23af8612b7debeb22aec1ab827e81847bbb235"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.320921 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z2jtl" event={"ID":"a3eda937-2ad6-4192-bd7f-c04b1697636e","Type":"ContainerStarted","Data":"ff69e5eb56df5c64e9e65d172fc4419413953f9ee865841bc928989ad1648dba"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.330260 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" event={"ID":"e285d56c-2894-48f6-987d-217b4efd8f6e","Type":"ContainerStarted","Data":"29075d32721dcd16ed393c02a4f96be19bac357fbd9795e8ad4a455d1512288e"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.330296 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" event={"ID":"e285d56c-2894-48f6-987d-217b4efd8f6e","Type":"ContainerStarted","Data":"7264d6698c1d1a5e72ef8ceef0888b8ea26c52b3efc0a3d8908acf19bb5d71ef"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.343284 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hbn8p" event={"ID":"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b","Type":"ContainerStarted","Data":"4fb486f4f920b99ddeba7b09475eb6af016077a22cb3fc3de9c5a03b6fcc3380"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.343313 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hbn8p" event={"ID":"09b2e980-c7fc-4ae4-8e70-c8539c1a4d3b","Type":"ContainerStarted","Data":"8efe5c5a942daa78ad4fac5b6c713f8f63c161b6b9ff2d8947719adc09076e3a"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.343687 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hbn8p"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.358950 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-9s2zg" event={"ID":"2ecb616f-62fc-4ff2-a353-6e08c63581a8","Type":"ContainerStarted","Data":"ad7907b63f8f9f5c5834d5166c30e3a00b84479219d37eb0c0789106e444c82d"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.377907 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.378614 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:25.878588945 +0000 UTC m=+147.866600417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.379934 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.380786 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.381231 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:25.88121477 +0000 UTC m=+147.869226242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.382514 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2" event={"ID":"7ea42bdd-b321-4b32-91d3-0e1cb559ace7","Type":"ContainerStarted","Data":"b029ef99cae0e1ee3cea358e9472435d9b1e48f93d035c43a45027fbb9cd2cce"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.382622 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2" event={"ID":"7ea42bdd-b321-4b32-91d3-0e1cb559ace7","Type":"ContainerStarted","Data":"ffb5d2b3e3376d37244b725eb267e1a078ece26cb3ab15c928387f1c2c22ae80"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.383488 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.385079 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" event={"ID":"f4b40324-2628-4351-a48f-c607b5a19114","Type":"ContainerStarted","Data":"58ece7acfded87aa187de4be5bab5b67c5a37862de44a6417cc679b9ac7ff9cf"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.385110 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" event={"ID":"f4b40324-2628-4351-a48f-c607b5a19114","Type":"ContainerStarted","Data":"601aca42d33e44a36c0e32fda8ffa399fc7755c94d715df5528fbf0a460110db"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.388815 4788 patch_prober.go:28] interesting pod/apiserver-76f77b778f-qgw8f container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.388854 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" podUID="039e68bf-13c6-484b-a4db-229d6d6b5886" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.412332 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.414818 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.417971 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb" event={"ID":"5a3854ab-c63a-4697-975c-e049cc23e0d3","Type":"ContainerStarted","Data":"e545e90e0643a0fb41f08e0bbcdd6f57986a9290641586c1cb95bbbcc0bd692f"}
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.418067 4788 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-bpl9k container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.418103 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" podUID="3c317fc6-2b8f-40ea-97a8-03a1da391b8f" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.419663 4788 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jdjtx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.419697 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" podUID="90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.423654 4788 patch_prober.go:28] interesting pod/downloads-7954f5f757-h7qcn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.423681 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h7qcn" podUID="b590f5cb-d3a6-43b9-97ef-29ad515ecbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.434444 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-brvcp"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.461683 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sfqbb"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.484777 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.486141 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:25.986126714 +0000 UTC m=+147.974138176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.551462 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" podStartSLOduration=125.551445794 podStartE2EDuration="2m5.551445794s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:25.377525505 +0000 UTC m=+147.365536977" watchObservedRunningTime="2026-02-19 08:47:25.551445794 +0000 UTC m=+147.539457266"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.592380 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.593169 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.093149548 +0000 UTC m=+148.081161020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.684647 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 19 08:47:25 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld
Feb 19 08:47:25 crc kubenswrapper[4788]: [+]process-running ok
Feb 19 08:47:25 crc kubenswrapper[4788]: healthz check failed
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.684727 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.694208 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.694453 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.194401257 +0000 UTC m=+148.182412749 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.694549 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.694921 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.194905071 +0000 UTC m=+148.182916543 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.771058 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfw9m" podStartSLOduration=125.771039601 podStartE2EDuration="2m5.771039601s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:25.752551572 +0000 UTC m=+147.740563044" watchObservedRunningTime="2026-02-19 08:47:25.771039601 +0000 UTC m=+147.759051073"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.771808 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4mmpf" podStartSLOduration=125.771803103 podStartE2EDuration="2m5.771803103s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:25.553571375 +0000 UTC m=+147.541582847" watchObservedRunningTime="2026-02-19 08:47:25.771803103 +0000 UTC m=+147.759814565"
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.795998 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.796228 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.296186161 +0000 UTC m=+148.284197633 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.796314 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.796724 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.296706216 +0000 UTC m=+148.284717688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:25 crc kubenswrapper[4788]: I0219 08:47:25.898119 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:25 crc kubenswrapper[4788]: E0219 08:47:25.898715 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.398692366 +0000 UTC m=+148.386703838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.001156 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df"
Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.003682 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.503664621 +0000 UTC m=+148.491676093 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.053234 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-zhsg9" podStartSLOduration=126.05320416 podStartE2EDuration="2m6.05320416s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:26.051684756 +0000 UTC m=+148.039696238" watchObservedRunningTime="2026-02-19 08:47:26.05320416 +0000 UTC m=+148.041215632"
Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.092678 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.106742 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.107739 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.607723671 +0000 UTC m=+148.595735133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.123296 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-n2kj2" podStartSLOduration=126.123279326 podStartE2EDuration="2m6.123279326s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:26.083754935 +0000 UTC m=+148.071766407" watchObservedRunningTime="2026-02-19 08:47:26.123279326 +0000 UTC m=+148.111290798"
Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.125007 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qmnq5" podStartSLOduration=126.125002756 podStartE2EDuration="2m6.125002756s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:26.122587256 +0000 UTC m=+148.110598728" watchObservedRunningTime="2026-02-19 08:47:26.125002756 +0000 UTC m=+148.113014228"
Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.186376 4788 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-ltdd8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.37:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.186812 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" podUID="3b09b21d-9a25-46cf-92c9-a1e427c068c6" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.209165 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.209606 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.709594438 +0000 UTC m=+148.697605910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.223079 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-kq7pj" podStartSLOduration=126.223061123 podStartE2EDuration="2m6.223061123s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:26.158229397 +0000 UTC m=+148.146240869" watchObservedRunningTime="2026-02-19 08:47:26.223061123 +0000 UTC m=+148.211072595" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.223498 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" podStartSLOduration=126.223491806 podStartE2EDuration="2m6.223491806s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:26.223215348 +0000 UTC m=+148.211226820" watchObservedRunningTime="2026-02-19 08:47:26.223491806 +0000 UTC m=+148.211503278" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.265419 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hbn8p" podStartSLOduration=8.265403134 podStartE2EDuration="8.265403134s" podCreationTimestamp="2026-02-19 08:47:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:26.263770488 +0000 UTC m=+148.251781960" watchObservedRunningTime="2026-02-19 08:47:26.265403134 +0000 UTC m=+148.253414606" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.313954 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.314138 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.814114309 +0000 UTC m=+148.802125781 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.314450 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.314757 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.814749757 +0000 UTC m=+148.802761229 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.415340 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.415529 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.915501262 +0000 UTC m=+148.903512744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.415645 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.416033 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:26.916016497 +0000 UTC m=+148.904027969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.454324 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" event={"ID":"ec358d29-e3c8-4f69-a1bc-7879193b026a","Type":"ContainerStarted","Data":"b2862233a5dbe39e6dd2e3ac6ecced929849d4553eb379ec723e694371e63dbe"} Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.454534 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" event={"ID":"ec358d29-e3c8-4f69-a1bc-7879193b026a","Type":"ContainerStarted","Data":"cb5ff9076182beac5c4ae2f03105abdb525665c00aa74880fc4cbb3bbf9480e0"} Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.456875 4788 patch_prober.go:28] interesting pod/downloads-7954f5f757-h7qcn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.456936 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h7qcn" podUID="b590f5cb-d3a6-43b9-97ef-29ad515ecbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.476436 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.519659 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.521668 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:27.021651951 +0000 UTC m=+149.009663423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.547220 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-ltdd8" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.624974 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 
08:47:26.625039 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.625081 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.625105 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.625129 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.629640 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 08:47:27.129623962 +0000 UTC m=+149.117635434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.631315 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.634731 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.638005 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.645240 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.690437 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:26 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:26 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:26 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.690489 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.727483 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.727835 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:27.227821104 +0000 UTC m=+149.215832576 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.754378 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.770453 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.781512 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.830021 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.830352 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:27.330341009 +0000 UTC m=+149.318352481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.931217 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:26 crc kubenswrapper[4788]: E0219 08:47:26.932380 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:27.43235883 +0000 UTC m=+149.420370302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.986768 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wmfkx"] Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.987900 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:47:26 crc kubenswrapper[4788]: I0219 08:47:26.991762 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.012977 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmfkx"] Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.033575 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-utilities\") pod \"community-operators-wmfkx\" (UID: \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") " pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.033629 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-catalog-content\") pod \"community-operators-wmfkx\" (UID: \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") " pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.033667 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cj29\" (UniqueName: \"kubernetes.io/projected/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-kube-api-access-2cj29\") pod \"community-operators-wmfkx\" (UID: \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") " pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.033772 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:27 crc kubenswrapper[4788]: E0219 08:47:27.034081 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:27.534066542 +0000 UTC m=+149.522078014 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.134504 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:27 crc kubenswrapper[4788]: E0219 08:47:27.134637 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:27.634612311 +0000 UTC m=+149.622623783 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.134873 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.134943 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-utilities\") pod \"community-operators-wmfkx\" (UID: \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") " pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.134970 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-catalog-content\") pod \"community-operators-wmfkx\" (UID: \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") " pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.134997 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cj29\" (UniqueName: \"kubernetes.io/projected/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-kube-api-access-2cj29\") pod \"community-operators-wmfkx\" (UID: 
\"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") " pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:47:27 crc kubenswrapper[4788]: E0219 08:47:27.135135 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:27.635127835 +0000 UTC m=+149.623139307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.135899 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-utilities\") pod \"community-operators-wmfkx\" (UID: \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") " pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.135909 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-catalog-content\") pod \"community-operators-wmfkx\" (UID: \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") " pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.169096 4788 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.186063 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cj29\" (UniqueName: \"kubernetes.io/projected/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-kube-api-access-2cj29\") pod \"community-operators-wmfkx\" (UID: \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") " pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.199171 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-swpvd"] Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.204970 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.212053 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.236709 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.236938 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-utilities\") pod \"certified-operators-swpvd\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") " pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.236986 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6tvr\" (UniqueName: \"kubernetes.io/projected/f727c8c6-b0d5-470e-bd9a-593b908dbef4-kube-api-access-l6tvr\") pod 
\"certified-operators-swpvd\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") " pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.237019 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-catalog-content\") pod \"certified-operators-swpvd\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") " pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:47:27 crc kubenswrapper[4788]: E0219 08:47:27.237128 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:27.737112815 +0000 UTC m=+149.725124287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.239323 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swpvd"] Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.335882 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.337686 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6tvr\" (UniqueName: \"kubernetes.io/projected/f727c8c6-b0d5-470e-bd9a-593b908dbef4-kube-api-access-l6tvr\") pod \"certified-operators-swpvd\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") " pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.337737 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-catalog-content\") pod \"certified-operators-swpvd\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") " pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.337813 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-utilities\") pod \"certified-operators-swpvd\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") " pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.337841 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:27 crc kubenswrapper[4788]: E0219 08:47:27.338117 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-19 08:47:27.838105137 +0000 UTC m=+149.826116609 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.339037 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-catalog-content\") pod \"certified-operators-swpvd\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") " pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.339268 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-utilities\") pod \"certified-operators-swpvd\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") " pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.369044 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6tvr\" (UniqueName: \"kubernetes.io/projected/f727c8c6-b0d5-470e-bd9a-593b908dbef4-kube-api-access-l6tvr\") pod \"certified-operators-swpvd\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") " pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.381777 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5blbm"] Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.394744 4788 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.397254 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5blbm"] Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.439455 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.439737 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-utilities\") pod \"community-operators-5blbm\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.439774 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wbls\" (UniqueName: \"kubernetes.io/projected/31ee37e0-613d-48b1-8e38-e8cc4608ee14-kube-api-access-4wbls\") pod \"community-operators-5blbm\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.439997 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-catalog-content\") pod \"community-operators-5blbm\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:47:27 crc kubenswrapper[4788]: E0219 08:47:27.440308 4788 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:27.940280612 +0000 UTC m=+149.928292084 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.502460 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4ea2cc5407e988ffcde661df9560714dc6518320fcc4a6411d085d8a41c220e3"} Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.518407 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c759c813c631a3525591ad94073ec85c3e2b13e1e0f6831d3669168126739de0"} Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.520692 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" event={"ID":"ec358d29-e3c8-4f69-a1bc-7879193b026a","Type":"ContainerStarted","Data":"56530790766dbcfe83688eea636d23da1bbe3d9dcce517ea097476deaa6a9113"} Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.522945 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"341fae91a461ff3397d0091fef0b4f9568c39b1d934c0dccae4b8acf6b17abcd"} Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.529910 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-8vngm" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.543935 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-utilities\") pod \"community-operators-5blbm\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.543974 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wbls\" (UniqueName: \"kubernetes.io/projected/31ee37e0-613d-48b1-8e38-e8cc4608ee14-kube-api-access-4wbls\") pod \"community-operators-5blbm\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.544091 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-catalog-content\") pod \"community-operators-5blbm\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.544203 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.547563 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-catalog-content\") pod \"community-operators-5blbm\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:47:27 crc kubenswrapper[4788]: E0219 08:47:27.547565 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:28.047552383 +0000 UTC m=+150.035563855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.554015 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-utilities\") pod \"community-operators-5blbm\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.560378 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-9c2k4" podStartSLOduration=10.56035977 podStartE2EDuration="10.56035977s" podCreationTimestamp="2026-02-19 08:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:27.55336472 +0000 UTC m=+149.541376192" watchObservedRunningTime="2026-02-19 08:47:27.56035977 +0000 UTC m=+149.548371242" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.590835 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wbls\" (UniqueName: \"kubernetes.io/projected/31ee37e0-613d-48b1-8e38-e8cc4608ee14-kube-api-access-4wbls\") pod \"community-operators-5blbm\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.594540 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.606940 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dlmp7"] Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.613390 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.622130 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlmp7"] Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.645160 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:27 crc kubenswrapper[4788]: E0219 08:47:27.645351 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:28.145328853 +0000 UTC m=+150.133340325 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.645420 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-catalog-content\") pod \"certified-operators-dlmp7\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.645482 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.645507 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddld7\" (UniqueName: \"kubernetes.io/projected/218f4f15-8940-4e61-b94f-9967289b9846-kube-api-access-ddld7\") pod \"certified-operators-dlmp7\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.645525 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-utilities\") pod \"certified-operators-dlmp7\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:47:27 crc kubenswrapper[4788]: E0219 08:47:27.645840 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:28.145833287 +0000 UTC m=+150.133844759 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.682502 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:27 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:27 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:27 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.682554 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.722732 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.746185 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.746349 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-catalog-content\") pod \"certified-operators-dlmp7\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.746415 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddld7\" (UniqueName: \"kubernetes.io/projected/218f4f15-8940-4e61-b94f-9967289b9846-kube-api-access-ddld7\") pod \"certified-operators-dlmp7\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.746434 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-utilities\") pod \"certified-operators-dlmp7\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.746799 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-utilities\") pod \"certified-operators-dlmp7\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " 
pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:47:27 crc kubenswrapper[4788]: E0219 08:47:27.746882 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-19 08:47:28.24686438 +0000 UTC m=+150.234875852 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.747141 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-catalog-content\") pod \"certified-operators-dlmp7\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.776928 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddld7\" (UniqueName: \"kubernetes.io/projected/218f4f15-8940-4e61-b94f-9967289b9846-kube-api-access-ddld7\") pod \"certified-operators-dlmp7\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.778428 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wmfkx"] Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.847840 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:27 crc kubenswrapper[4788]: E0219 08:47:27.848321 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-19 08:47:28.348308434 +0000 UTC m=+150.336319906 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-pj5df" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.879163 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swpvd"] Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.911630 4788 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-19T08:47:27.169127359Z","Handler":null,"Name":""} Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.920434 4788 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.920467 4788 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with 
name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.963810 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.964427 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:47:27 crc kubenswrapper[4788]: I0219 08:47:27.974491 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.033799 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5blbm"] Feb 19 08:47:28 crc kubenswrapper[4788]: W0219 08:47:28.056973 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31ee37e0_613d_48b1_8e38_e8cc4608ee14.slice/crio-14b91294edc4866aaf09a102b0f887affbc1bc41f7f48726dfe45e54bb5b1116 WatchSource:0}: Error finding container 14b91294edc4866aaf09a102b0f887affbc1bc41f7f48726dfe45e54bb5b1116: Status 404 returned error can't find the container with id 14b91294edc4866aaf09a102b0f887affbc1bc41f7f48726dfe45e54bb5b1116 Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.065129 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.070376 4788 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.070412 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.117050 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-pj5df\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.182214 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlmp7"] Feb 19 08:47:28 crc kubenswrapper[4788]: W0219 08:47:28.258133 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod218f4f15_8940_4e61_b94f_9967289b9846.slice/crio-7fcc40e95e4424faaa199789e7e493dee097b4200e5844861960f4496fe99995 WatchSource:0}: Error finding container 7fcc40e95e4424faaa199789e7e493dee097b4200e5844861960f4496fe99995: Status 404 returned error can't find the container with id 7fcc40e95e4424faaa199789e7e493dee097b4200e5844861960f4496fe99995 Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.417978 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.532523 4788 generic.go:334] "Generic (PLEG): container finished" podID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" containerID="85af9860d53099e9a2b042134be4c710fb240bdf1dddfcd3dfaed8f7571a6dbf" exitCode=0 Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.532612 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swpvd" event={"ID":"f727c8c6-b0d5-470e-bd9a-593b908dbef4","Type":"ContainerDied","Data":"85af9860d53099e9a2b042134be4c710fb240bdf1dddfcd3dfaed8f7571a6dbf"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.532649 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swpvd" event={"ID":"f727c8c6-b0d5-470e-bd9a-593b908dbef4","Type":"ContainerStarted","Data":"ea43d555964b365165ec2df291041b39b6f22a8cf71c39b13d941dbdafbbe563"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.534274 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.539566 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"435960f115eac39d849236e4d8cbd9d652302b81e24a524b4af0993ddcfd43f0"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.539641 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.558725 4788 generic.go:334] "Generic (PLEG): container finished" podID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" containerID="58675e2cebfd873df0a5074dc1ae749e50ccf53da4a562cd29fc7dedc3f40a30" exitCode=0 Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.558812 
4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmfkx" event={"ID":"c3846ca6-3c9c-4f02-978e-bee6148e0ba7","Type":"ContainerDied","Data":"58675e2cebfd873df0a5074dc1ae749e50ccf53da4a562cd29fc7dedc3f40a30"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.558841 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmfkx" event={"ID":"c3846ca6-3c9c-4f02-978e-bee6148e0ba7","Type":"ContainerStarted","Data":"d4e36ac0cee73007b853f0c1c304199f2fcbea25fe5d1f20d0b2bdd88f73bce2"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.568107 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e5e22e097fad5cf2ca17d9aab33f4916eea15c189e6f3a0981988952e3424436"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.590431 4788 generic.go:334] "Generic (PLEG): container finished" podID="6826dee1-4dec-4b7c-88a1-600eb014574c" containerID="358449ecb2f9cdb5274a35d672b21a1e0a91b14ff53f4bd28527d24009a3d03e" exitCode=0 Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.590506 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" event={"ID":"6826dee1-4dec-4b7c-88a1-600eb014574c","Type":"ContainerDied","Data":"358449ecb2f9cdb5274a35d672b21a1e0a91b14ff53f4bd28527d24009a3d03e"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.600857 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"284521a97ab73cfa298f81419409cbd556af2118e2a0cbfdf367f42f519061ad"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.606174 4788 generic.go:334] "Generic (PLEG): container finished" 
podID="218f4f15-8940-4e61-b94f-9967289b9846" containerID="af127b10f7376c06bcc60e197cda03aa154ac3f0ad10b61a589184d47fbdb8d8" exitCode=0 Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.606276 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlmp7" event={"ID":"218f4f15-8940-4e61-b94f-9967289b9846","Type":"ContainerDied","Data":"af127b10f7376c06bcc60e197cda03aa154ac3f0ad10b61a589184d47fbdb8d8"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.606302 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlmp7" event={"ID":"218f4f15-8940-4e61-b94f-9967289b9846","Type":"ContainerStarted","Data":"7fcc40e95e4424faaa199789e7e493dee097b4200e5844861960f4496fe99995"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.608788 4788 generic.go:334] "Generic (PLEG): container finished" podID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" containerID="f6b5079491e184c6892bb424c51766b51261bfa402e685ddd8cceac43d7f2df7" exitCode=0 Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.608852 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5blbm" event={"ID":"31ee37e0-613d-48b1-8e38-e8cc4608ee14","Type":"ContainerDied","Data":"f6b5079491e184c6892bb424c51766b51261bfa402e685ddd8cceac43d7f2df7"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.608878 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5blbm" event={"ID":"31ee37e0-613d-48b1-8e38-e8cc4608ee14","Type":"ContainerStarted","Data":"14b91294edc4866aaf09a102b0f887affbc1bc41f7f48726dfe45e54bb5b1116"} Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.669270 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pj5df"] Feb 19 08:47:28 crc kubenswrapper[4788]: W0219 08:47:28.694665 4788 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89374177_14ef_4b9a_938a_a838d6d0aab1.slice/crio-fa782e474278ecdf926d593727c42416fcc76e0e96c5e2c9859d654415dd801c WatchSource:0}: Error finding container fa782e474278ecdf926d593727c42416fcc76e0e96c5e2c9859d654415dd801c: Status 404 returned error can't find the container with id fa782e474278ecdf926d593727c42416fcc76e0e96c5e2c9859d654415dd801c Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.694825 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:28 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:28 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:28 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.694908 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:28 crc kubenswrapper[4788]: I0219 08:47:28.733186 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.182513 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s2fqp"] Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.183948 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.187458 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.194171 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2fqp"] Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.284959 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn6bl\" (UniqueName: \"kubernetes.io/projected/85dfd540-d029-4a79-a997-3f2f3796b7b1-kube-api-access-jn6bl\") pod \"redhat-marketplace-s2fqp\" (UID: \"85dfd540-d029-4a79-a997-3f2f3796b7b1\") " pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.285109 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-catalog-content\") pod \"redhat-marketplace-s2fqp\" (UID: \"85dfd540-d029-4a79-a997-3f2f3796b7b1\") " pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.285143 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-utilities\") pod \"redhat-marketplace-s2fqp\" (UID: \"85dfd540-d029-4a79-a997-3f2f3796b7b1\") " pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.386025 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn6bl\" (UniqueName: \"kubernetes.io/projected/85dfd540-d029-4a79-a997-3f2f3796b7b1-kube-api-access-jn6bl\") pod \"redhat-marketplace-s2fqp\" (UID: 
\"85dfd540-d029-4a79-a997-3f2f3796b7b1\") " pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.386092 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-catalog-content\") pod \"redhat-marketplace-s2fqp\" (UID: \"85dfd540-d029-4a79-a997-3f2f3796b7b1\") " pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.386109 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-utilities\") pod \"redhat-marketplace-s2fqp\" (UID: \"85dfd540-d029-4a79-a997-3f2f3796b7b1\") " pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.387419 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-utilities\") pod \"redhat-marketplace-s2fqp\" (UID: \"85dfd540-d029-4a79-a997-3f2f3796b7b1\") " pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.387947 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-catalog-content\") pod \"redhat-marketplace-s2fqp\" (UID: \"85dfd540-d029-4a79-a997-3f2f3796b7b1\") " pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.405464 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn6bl\" (UniqueName: \"kubernetes.io/projected/85dfd540-d029-4a79-a997-3f2f3796b7b1-kube-api-access-jn6bl\") pod \"redhat-marketplace-s2fqp\" (UID: \"85dfd540-d029-4a79-a997-3f2f3796b7b1\") " 
pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.489307 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.490089 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.492902 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.493098 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.498391 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.538998 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.585314 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fhkp5"] Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.586323 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.606507 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhkp5"] Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.645722 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" event={"ID":"89374177-14ef-4b9a-938a-a838d6d0aab1","Type":"ContainerStarted","Data":"88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322"} Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.645772 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" event={"ID":"89374177-14ef-4b9a-938a-a838d6d0aab1","Type":"ContainerStarted","Data":"fa782e474278ecdf926d593727c42416fcc76e0e96c5e2c9859d654415dd801c"} Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.646211 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.672867 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" podStartSLOduration=129.672852682 podStartE2EDuration="2m9.672852682s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:29.661864467 +0000 UTC m=+151.649875949" watchObservedRunningTime="2026-02-19 08:47:29.672852682 +0000 UTC m=+151.660864154" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.684151 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Feb 19 08:47:29 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:29 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:29 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.684390 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.692004 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v7kn\" (UniqueName: \"kubernetes.io/projected/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-kube-api-access-2v7kn\") pod \"redhat-marketplace-fhkp5\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.692190 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-catalog-content\") pod \"redhat-marketplace-fhkp5\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.692216 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-utilities\") pod \"redhat-marketplace-fhkp5\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.692271 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.692286 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.792867 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.792957 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.793066 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v7kn\" (UniqueName: \"kubernetes.io/projected/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-kube-api-access-2v7kn\") pod \"redhat-marketplace-fhkp5\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.793096 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-catalog-content\") pod \"redhat-marketplace-fhkp5\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.793332 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-utilities\") pod \"redhat-marketplace-fhkp5\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.794625 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-catalog-content\") pod \"redhat-marketplace-fhkp5\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.795002 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-utilities\") pod \"redhat-marketplace-fhkp5\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.795041 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.795156 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2fqp"] Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 
08:47:29.820306 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v7kn\" (UniqueName: \"kubernetes.io/projected/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-kube-api-access-2v7kn\") pod \"redhat-marketplace-fhkp5\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.832071 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.904513 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" Feb 19 08:47:29 crc kubenswrapper[4788]: I0219 08:47:29.944736 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.099336 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6826dee1-4dec-4b7c-88a1-600eb014574c-secret-volume\") pod \"6826dee1-4dec-4b7c-88a1-600eb014574c\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.099774 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6826dee1-4dec-4b7c-88a1-600eb014574c-config-volume\") pod \"6826dee1-4dec-4b7c-88a1-600eb014574c\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.099867 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjfgr\" (UniqueName: \"kubernetes.io/projected/6826dee1-4dec-4b7c-88a1-600eb014574c-kube-api-access-bjfgr\") pod \"6826dee1-4dec-4b7c-88a1-600eb014574c\" (UID: \"6826dee1-4dec-4b7c-88a1-600eb014574c\") " Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.102350 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6826dee1-4dec-4b7c-88a1-600eb014574c-config-volume" (OuterVolumeSpecName: "config-volume") pod "6826dee1-4dec-4b7c-88a1-600eb014574c" (UID: "6826dee1-4dec-4b7c-88a1-600eb014574c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.116783 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.117718 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6826dee1-4dec-4b7c-88a1-600eb014574c-kube-api-access-bjfgr" (OuterVolumeSpecName: "kube-api-access-bjfgr") pod "6826dee1-4dec-4b7c-88a1-600eb014574c" (UID: "6826dee1-4dec-4b7c-88a1-600eb014574c"). InnerVolumeSpecName "kube-api-access-bjfgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.118109 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6826dee1-4dec-4b7c-88a1-600eb014574c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6826dee1-4dec-4b7c-88a1-600eb014574c" (UID: "6826dee1-4dec-4b7c-88a1-600eb014574c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.184422 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fx2sh"] Feb 19 08:47:30 crc kubenswrapper[4788]: E0219 08:47:30.184616 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6826dee1-4dec-4b7c-88a1-600eb014574c" containerName="collect-profiles" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.184628 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="6826dee1-4dec-4b7c-88a1-600eb014574c" containerName="collect-profiles" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.184730 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="6826dee1-4dec-4b7c-88a1-600eb014574c" containerName="collect-profiles" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.185396 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.188034 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.201897 4788 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6826dee1-4dec-4b7c-88a1-600eb014574c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.201919 4788 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6826dee1-4dec-4b7c-88a1-600eb014574c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.201931 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjfgr\" (UniqueName: \"kubernetes.io/projected/6826dee1-4dec-4b7c-88a1-600eb014574c-kube-api-access-bjfgr\") on node \"crc\" DevicePath \"\"" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.206679 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fx2sh"] Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.279897 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.279928 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.288696 4788 patch_prober.go:28] interesting pod/console-f9d7485db-jr6pt container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.288744 
4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jr6pt" podUID="7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.303428 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-catalog-content\") pod \"redhat-operators-fx2sh\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") " pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.303514 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bxqf\" (UniqueName: \"kubernetes.io/projected/eaaee2a4-db49-437f-a87b-98beb5e66e91-kube-api-access-4bxqf\") pod \"redhat-operators-fx2sh\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") " pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.303559 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-utilities\") pod \"redhat-operators-fx2sh\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") " pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.393352 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.401811 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-qgw8f" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.406876 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-catalog-content\") pod \"redhat-operators-fx2sh\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") " pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.407102 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bxqf\" (UniqueName: \"kubernetes.io/projected/eaaee2a4-db49-437f-a87b-98beb5e66e91-kube-api-access-4bxqf\") pod \"redhat-operators-fx2sh\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") " pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.407209 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-utilities\") pod \"redhat-operators-fx2sh\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") " pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.408301 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-utilities\") pod \"redhat-operators-fx2sh\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") " pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.408499 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-catalog-content\") pod \"redhat-operators-fx2sh\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") " pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.426581 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.440657 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bpl9k" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.442706 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bxqf\" (UniqueName: \"kubernetes.io/projected/eaaee2a4-db49-437f-a87b-98beb5e66e91-kube-api-access-4bxqf\") pod \"redhat-operators-fx2sh\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") " pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.509372 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.519974 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.561947 4788 patch_prober.go:28] interesting pod/downloads-7954f5f757-h7qcn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.561993 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h7qcn" podUID="b590f5cb-d3a6-43b9-97ef-29ad515ecbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.562030 4788 patch_prober.go:28] interesting pod/downloads-7954f5f757-h7qcn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 
10.217.0.13:8080: connect: connection refused" start-of-body= Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.562084 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-h7qcn" podUID="b590f5cb-d3a6-43b9-97ef-29ad515ecbc9" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.592976 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhkp5"] Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.611628 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bjnxw"] Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.613003 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.627760 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bjnxw"] Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.679373 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.687495 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6","Type":"ContainerStarted","Data":"d92f85f3b6f2788c2f570794f631d937a79f3ae308a0657b5e23a8c5d8cff148"} Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.691434 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:30 crc 
kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:30 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:30 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.691487 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.709822 4788 generic.go:334] "Generic (PLEG): container finished" podID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerID="ac3f8ed3dcfc8253b3ee2a61d39029411bc301702a32188f364f7f9adbc6b00d" exitCode=0 Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.710076 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2fqp" event={"ID":"85dfd540-d029-4a79-a997-3f2f3796b7b1","Type":"ContainerDied","Data":"ac3f8ed3dcfc8253b3ee2a61d39029411bc301702a32188f364f7f9adbc6b00d"} Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.710113 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2fqp" event={"ID":"85dfd540-d029-4a79-a997-3f2f3796b7b1","Type":"ContainerStarted","Data":"504663f4a4a18e535cf7331a360a29e3a7e3ea772de315804a8f78b8d8288857"} Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.711920 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-utilities\") pod \"redhat-operators-bjnxw\" (UID: \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.711955 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-catalog-content\") pod \"redhat-operators-bjnxw\" (UID: \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.711975 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gspsg\" (UniqueName: \"kubernetes.io/projected/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-kube-api-access-gspsg\") pod \"redhat-operators-bjnxw\" (UID: \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.764237 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.771653 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq" event={"ID":"6826dee1-4dec-4b7c-88a1-600eb014574c","Type":"ContainerDied","Data":"55aadb5f6cdf03ef728a8d94f8939966b13ea1ce88dc11ff0478e4bbe744bb40"} Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.771714 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55aadb5f6cdf03ef728a8d94f8939966b13ea1ce88dc11ff0478e4bbe744bb40" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.771725 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhkp5" event={"ID":"4ef38e88-8e5c-4b56-8123-a60a3eded0a7","Type":"ContainerStarted","Data":"af99c4e7cb72bb6a0cf8a83d5cde041d5ddf987bfaae703840c4c340d427c286"} Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.813020 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-utilities\") pod 
\"redhat-operators-bjnxw\" (UID: \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.813065 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-catalog-content\") pod \"redhat-operators-bjnxw\" (UID: \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.813080 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gspsg\" (UniqueName: \"kubernetes.io/projected/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-kube-api-access-gspsg\") pod \"redhat-operators-bjnxw\" (UID: \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.815292 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-utilities\") pod \"redhat-operators-bjnxw\" (UID: \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.815414 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-catalog-content\") pod \"redhat-operators-bjnxw\" (UID: \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.844176 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gspsg\" (UniqueName: \"kubernetes.io/projected/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-kube-api-access-gspsg\") pod \"redhat-operators-bjnxw\" (UID: 
\"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:47:30 crc kubenswrapper[4788]: I0219 08:47:30.959912 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.193341 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fx2sh"] Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.495546 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bjnxw"] Feb 19 08:47:31 crc kubenswrapper[4788]: W0219 08:47:31.508662 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc056bee7_9698_4107_b2e8_dd9c2b3eb6a6.slice/crio-64358431445cdffd7cdf19876997ec4315aeef97b97afb98256d094b58c12287 WatchSource:0}: Error finding container 64358431445cdffd7cdf19876997ec4315aeef97b97afb98256d094b58c12287: Status 404 returned error can't find the container with id 64358431445cdffd7cdf19876997ec4315aeef97b97afb98256d094b58c12287 Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.682907 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:31 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:31 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:31 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.682975 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 
19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.795800 4788 generic.go:334] "Generic (PLEG): container finished" podID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerID="1ffe3a55100beeac8f126fdabb7d81205c0e293a125f022ff70a37ce71477e0a" exitCode=0 Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.795871 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjnxw" event={"ID":"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6","Type":"ContainerDied","Data":"1ffe3a55100beeac8f126fdabb7d81205c0e293a125f022ff70a37ce71477e0a"} Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.796186 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjnxw" event={"ID":"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6","Type":"ContainerStarted","Data":"64358431445cdffd7cdf19876997ec4315aeef97b97afb98256d094b58c12287"} Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.801325 4788 generic.go:334] "Generic (PLEG): container finished" podID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerID="af5bda2dfa60f08d7ab44a5b46ccb81d44046164981e60ba39b2ae1c021ae42d" exitCode=0 Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.801376 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx2sh" event={"ID":"eaaee2a4-db49-437f-a87b-98beb5e66e91","Type":"ContainerDied","Data":"af5bda2dfa60f08d7ab44a5b46ccb81d44046164981e60ba39b2ae1c021ae42d"} Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.801392 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx2sh" event={"ID":"eaaee2a4-db49-437f-a87b-98beb5e66e91","Type":"ContainerStarted","Data":"3228cb949f6837237b48ba9f037c959354c8150f23fc39a74a223bc0ce09c50e"} Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.819683 4788 generic.go:334] "Generic (PLEG): container finished" podID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" 
containerID="5594757e59dde3811fa1cd268c9064df5e239699aaca31565a236c8175af6b5e" exitCode=0 Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.819781 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhkp5" event={"ID":"4ef38e88-8e5c-4b56-8123-a60a3eded0a7","Type":"ContainerDied","Data":"5594757e59dde3811fa1cd268c9064df5e239699aaca31565a236c8175af6b5e"} Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.830041 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6","Type":"ContainerStarted","Data":"395ae55b0796295c9c008d05ef7f24f2fce93a8d505893a32fcd7570162e4153"} Feb 19 08:47:31 crc kubenswrapper[4788]: I0219 08:47:31.878189 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.878173781 podStartE2EDuration="2.878173781s" podCreationTimestamp="2026-02-19 08:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:31.874379382 +0000 UTC m=+153.862390854" watchObservedRunningTime="2026-02-19 08:47:31.878173781 +0000 UTC m=+153.866185253" Feb 19 08:47:32 crc kubenswrapper[4788]: I0219 08:47:32.682472 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:32 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:32 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:32 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:32 crc kubenswrapper[4788]: I0219 08:47:32.682737 4788 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:32 crc kubenswrapper[4788]: I0219 08:47:32.865108 4788 generic.go:334] "Generic (PLEG): container finished" podID="a2171a56-7a4c-4bde-8d04-a1510d4e4ed6" containerID="395ae55b0796295c9c008d05ef7f24f2fce93a8d505893a32fcd7570162e4153" exitCode=0 Feb 19 08:47:32 crc kubenswrapper[4788]: I0219 08:47:32.865157 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6","Type":"ContainerDied","Data":"395ae55b0796295c9c008d05ef7f24f2fce93a8d505893a32fcd7570162e4153"} Feb 19 08:47:33 crc kubenswrapper[4788]: I0219 08:47:33.685174 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:33 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:33 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:33 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:33 crc kubenswrapper[4788]: I0219 08:47:33.685470 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:33 crc kubenswrapper[4788]: I0219 08:47:33.837705 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 08:47:33 crc kubenswrapper[4788]: I0219 08:47:33.839311 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 08:47:33 crc kubenswrapper[4788]: I0219 08:47:33.841725 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 08:47:33 crc kubenswrapper[4788]: I0219 08:47:33.853489 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 08:47:33 crc kubenswrapper[4788]: I0219 08:47:33.860058 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 08:47:33 crc kubenswrapper[4788]: I0219 08:47:33.982221 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91ae2845-8dab-4371-a556-b0c758f85110-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"91ae2845-8dab-4371-a556-b0c758f85110\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 08:47:33 crc kubenswrapper[4788]: I0219 08:47:33.982489 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91ae2845-8dab-4371-a556-b0c758f85110-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"91ae2845-8dab-4371-a556-b0c758f85110\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.085452 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91ae2845-8dab-4371-a556-b0c758f85110-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"91ae2845-8dab-4371-a556-b0c758f85110\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.085536 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/91ae2845-8dab-4371-a556-b0c758f85110-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"91ae2845-8dab-4371-a556-b0c758f85110\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.086025 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91ae2845-8dab-4371-a556-b0c758f85110-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"91ae2845-8dab-4371-a556-b0c758f85110\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.165609 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91ae2845-8dab-4371-a556-b0c758f85110-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"91ae2845-8dab-4371-a556-b0c758f85110\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.188579 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.322159 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.389234 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kube-api-access\") pod \"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6\" (UID: \"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6\") " Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.389363 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kubelet-dir\") pod \"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6\" (UID: \"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6\") " Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.389467 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a2171a56-7a4c-4bde-8d04-a1510d4e4ed6" (UID: "a2171a56-7a4c-4bde-8d04-a1510d4e4ed6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.389971 4788 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.399096 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a2171a56-7a4c-4bde-8d04-a1510d4e4ed6" (UID: "a2171a56-7a4c-4bde-8d04-a1510d4e4ed6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.491218 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2171a56-7a4c-4bde-8d04-a1510d4e4ed6-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.682886 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:34 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:34 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:34 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.682968 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.738088 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.913334 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a2171a56-7a4c-4bde-8d04-a1510d4e4ed6","Type":"ContainerDied","Data":"d92f85f3b6f2788c2f570794f631d937a79f3ae308a0657b5e23a8c5d8cff148"} Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.913380 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d92f85f3b6f2788c2f570794f631d937a79f3ae308a0657b5e23a8c5d8cff148" Feb 19 08:47:34 crc kubenswrapper[4788]: I0219 08:47:34.913385 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 19 08:47:35 crc kubenswrapper[4788]: I0219 08:47:35.685816 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:35 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:35 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:35 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:35 crc kubenswrapper[4788]: I0219 08:47:35.686144 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:35 crc kubenswrapper[4788]: I0219 08:47:35.848212 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hbn8p" Feb 19 08:47:36 crc kubenswrapper[4788]: I0219 08:47:36.685115 4788 patch_prober.go:28] interesting pod/router-default-5444994796-8tvwh container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 19 08:47:36 crc kubenswrapper[4788]: [-]has-synced failed: reason withheld Feb 19 08:47:36 crc kubenswrapper[4788]: [+]process-running ok Feb 19 08:47:36 crc kubenswrapper[4788]: healthz check failed Feb 19 08:47:36 crc kubenswrapper[4788]: I0219 08:47:36.685178 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-8tvwh" podUID="164b83b5-7bd4-4bea-8f7e-76c83c46a4b0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 19 08:47:37 crc kubenswrapper[4788]: I0219 08:47:37.683300 4788 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:37 crc kubenswrapper[4788]: I0219 08:47:37.688037 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-8tvwh" Feb 19 08:47:40 crc kubenswrapper[4788]: I0219 08:47:40.292039 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:40 crc kubenswrapper[4788]: I0219 08:47:40.296094 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:47:40 crc kubenswrapper[4788]: I0219 08:47:40.559219 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-h7qcn" Feb 19 08:47:41 crc kubenswrapper[4788]: W0219 08:47:41.965535 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod91ae2845_8dab_4371_a556_b0c758f85110.slice/crio-1636c72bc2ea250724354f22da2d7a5fe441dbf815e9b00d10739ccb7198f7be WatchSource:0}: Error finding container 1636c72bc2ea250724354f22da2d7a5fe441dbf815e9b00d10739ccb7198f7be: Status 404 returned error can't find the container with id 1636c72bc2ea250724354f22da2d7a5fe441dbf815e9b00d10739ccb7198f7be Feb 19 08:47:42 crc kubenswrapper[4788]: I0219 08:47:42.224516 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:47:42 crc kubenswrapper[4788]: I0219 08:47:42.231861 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ad68454a-3350-49a5-9047-8b78e81ec79c-metrics-certs\") pod \"network-metrics-daemon-qbwlq\" (UID: \"ad68454a-3350-49a5-9047-8b78e81ec79c\") " pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:47:42 crc kubenswrapper[4788]: I0219 08:47:42.338131 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qbwlq" Feb 19 08:47:42 crc kubenswrapper[4788]: I0219 08:47:42.978463 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"91ae2845-8dab-4371-a556-b0c758f85110","Type":"ContainerStarted","Data":"1636c72bc2ea250724354f22da2d7a5fe441dbf815e9b00d10739ccb7198f7be"} Feb 19 08:47:48 crc kubenswrapper[4788]: I0219 08:47:48.423185 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:47:52 crc kubenswrapper[4788]: I0219 08:47:52.139222 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 08:47:52 crc kubenswrapper[4788]: I0219 08:47:52.139765 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 08:47:56 crc kubenswrapper[4788]: E0219 08:47:56.352285 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 19 08:47:56 crc 
kubenswrapper[4788]: E0219 08:47:56.353003 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2cj29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wmfkx_openshift-marketplace(c3846ca6-3c9c-4f02-978e-bee6148e0ba7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 08:47:56 crc kubenswrapper[4788]: E0219 08:47:56.354194 4788 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wmfkx" podUID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" Feb 19 08:47:57 crc kubenswrapper[4788]: E0219 08:47:57.937840 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wmfkx" podUID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" Feb 19 08:47:58 crc kubenswrapper[4788]: E0219 08:47:58.063554 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 19 08:47:58 crc kubenswrapper[4788]: E0219 08:47:58.064123 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l6tvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-swpvd_openshift-marketplace(f727c8c6-b0d5-470e-bd9a-593b908dbef4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 08:47:58 crc kubenswrapper[4788]: E0219 08:47:58.066625 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-swpvd" podUID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" Feb 19 08:47:58 crc 
kubenswrapper[4788]: E0219 08:47:58.133666 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 19 08:47:58 crc kubenswrapper[4788]: E0219 08:47:58.133972 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4wbls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-5blbm_openshift-marketplace(31ee37e0-613d-48b1-8e38-e8cc4608ee14): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 19 08:47:58 crc kubenswrapper[4788]: E0219 08:47:58.135907 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5blbm" podUID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" Feb 19 08:47:58 crc kubenswrapper[4788]: I0219 08:47:58.346338 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qbwlq"] Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.067137 4788 generic.go:334] "Generic (PLEG): container finished" podID="218f4f15-8940-4e61-b94f-9967289b9846" containerID="5e989d709a3c761f599b4d33510685268c6f27534665dfa6f8ac47e77cf03512" exitCode=0 Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.067418 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlmp7" event={"ID":"218f4f15-8940-4e61-b94f-9967289b9846","Type":"ContainerDied","Data":"5e989d709a3c761f599b4d33510685268c6f27534665dfa6f8ac47e77cf03512"} Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.069983 4788 generic.go:334] "Generic (PLEG): container finished" podID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" containerID="25fb1865e57dbb0c181ffdf77098a9b436c21747eca32848258acb17609045ac" exitCode=0 Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.070019 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhkp5" event={"ID":"4ef38e88-8e5c-4b56-8123-a60a3eded0a7","Type":"ContainerDied","Data":"25fb1865e57dbb0c181ffdf77098a9b436c21747eca32848258acb17609045ac"} Feb 19 08:47:59 crc 
kubenswrapper[4788]: I0219 08:47:59.073780 4788 generic.go:334] "Generic (PLEG): container finished" podID="91ae2845-8dab-4371-a556-b0c758f85110" containerID="e8b047b8d53687bc806ffea06b116d69a5a6870762b8f0e31877efa210cbc8df" exitCode=0 Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.073891 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"91ae2845-8dab-4371-a556-b0c758f85110","Type":"ContainerDied","Data":"e8b047b8d53687bc806ffea06b116d69a5a6870762b8f0e31877efa210cbc8df"} Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.076298 4788 generic.go:334] "Generic (PLEG): container finished" podID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerID="3c54b3808e0552c619d3914cb501ac16f79913e658f54925ef21e5b2c7d6a7be" exitCode=0 Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.076333 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2fqp" event={"ID":"85dfd540-d029-4a79-a997-3f2f3796b7b1","Type":"ContainerDied","Data":"3c54b3808e0552c619d3914cb501ac16f79913e658f54925ef21e5b2c7d6a7be"} Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.079662 4788 generic.go:334] "Generic (PLEG): container finished" podID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerID="4f673db51164b818979244b252175bcb0e084f76410a5a27cc116b129c035463" exitCode=0 Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.079765 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjnxw" event={"ID":"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6","Type":"ContainerDied","Data":"4f673db51164b818979244b252175bcb0e084f76410a5a27cc116b129c035463"} Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.095332 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" 
event={"ID":"ad68454a-3350-49a5-9047-8b78e81ec79c","Type":"ContainerStarted","Data":"848b35e555953acabeea4ca11e8754864e78997e39bf48786a804918238f2f80"} Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.095389 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" event={"ID":"ad68454a-3350-49a5-9047-8b78e81ec79c","Type":"ContainerStarted","Data":"fe326332be065392c8094e165a3f264557d77c69a0f592ca153c1834994785c0"} Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.095852 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qbwlq" event={"ID":"ad68454a-3350-49a5-9047-8b78e81ec79c","Type":"ContainerStarted","Data":"820ff6900eba313960450856322c719150d48948eb7d42bdb2f144bd63ffacca"} Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.098904 4788 generic.go:334] "Generic (PLEG): container finished" podID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerID="57ca5ca955b22eb0ed69e6ee86a3f5b17d765ccf015de1d464287d1f406c2471" exitCode=0 Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.099101 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx2sh" event={"ID":"eaaee2a4-db49-437f-a87b-98beb5e66e91","Type":"ContainerDied","Data":"57ca5ca955b22eb0ed69e6ee86a3f5b17d765ccf015de1d464287d1f406c2471"} Feb 19 08:47:59 crc kubenswrapper[4788]: E0219 08:47:59.101441 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-swpvd" podUID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" Feb 19 08:47:59 crc kubenswrapper[4788]: E0219 08:47:59.102314 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5blbm" podUID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" Feb 19 08:47:59 crc kubenswrapper[4788]: I0219 08:47:59.206808 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qbwlq" podStartSLOduration=159.206764597 podStartE2EDuration="2m39.206764597s" podCreationTimestamp="2026-02-19 08:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:47:59.20369946 +0000 UTC m=+181.191710932" watchObservedRunningTime="2026-02-19 08:47:59.206764597 +0000 UTC m=+181.194776069" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.127039 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx2sh" event={"ID":"eaaee2a4-db49-437f-a87b-98beb5e66e91","Type":"ContainerStarted","Data":"6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a"} Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.129385 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhkp5" event={"ID":"4ef38e88-8e5c-4b56-8123-a60a3eded0a7","Type":"ContainerStarted","Data":"77d2c02686781e1a8f98a4d0e200cc3616985a44a131fa25cf883794927f274f"} Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.131578 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlmp7" event={"ID":"218f4f15-8940-4e61-b94f-9967289b9846","Type":"ContainerStarted","Data":"c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3"} Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.133557 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2fqp" 
event={"ID":"85dfd540-d029-4a79-a997-3f2f3796b7b1","Type":"ContainerStarted","Data":"65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c"} Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.136530 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjnxw" event={"ID":"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6","Type":"ContainerStarted","Data":"cf3f9147848bcc3843c38d1c06161708f9b48700981e46db894060812e72d373"} Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.181845 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fx2sh" podStartSLOduration=2.483399009 podStartE2EDuration="30.181830264s" podCreationTimestamp="2026-02-19 08:47:30 +0000 UTC" firstStartedPulling="2026-02-19 08:47:31.804516812 +0000 UTC m=+153.792528274" lastFinishedPulling="2026-02-19 08:47:59.502948057 +0000 UTC m=+181.490959529" observedRunningTime="2026-02-19 08:48:00.150693183 +0000 UTC m=+182.138704665" watchObservedRunningTime="2026-02-19 08:48:00.181830264 +0000 UTC m=+182.169841736" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.201456 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s2fqp" podStartSLOduration=2.432933503 podStartE2EDuration="31.201440816s" podCreationTimestamp="2026-02-19 08:47:29 +0000 UTC" firstStartedPulling="2026-02-19 08:47:30.721733671 +0000 UTC m=+152.709745143" lastFinishedPulling="2026-02-19 08:47:59.490240984 +0000 UTC m=+181.478252456" observedRunningTime="2026-02-19 08:48:00.182411441 +0000 UTC m=+182.170422913" watchObservedRunningTime="2026-02-19 08:48:00.201440816 +0000 UTC m=+182.189452288" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.202801 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fhkp5" podStartSLOduration=3.390358846 podStartE2EDuration="31.202796855s" 
podCreationTimestamp="2026-02-19 08:47:29 +0000 UTC" firstStartedPulling="2026-02-19 08:47:31.821835168 +0000 UTC m=+153.809846630" lastFinishedPulling="2026-02-19 08:47:59.634273167 +0000 UTC m=+181.622284639" observedRunningTime="2026-02-19 08:48:00.201512648 +0000 UTC m=+182.189524130" watchObservedRunningTime="2026-02-19 08:48:00.202796855 +0000 UTC m=+182.190808327" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.230678 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dlmp7" podStartSLOduration=2.364628647 podStartE2EDuration="33.230661452s" podCreationTimestamp="2026-02-19 08:47:27 +0000 UTC" firstStartedPulling="2026-02-19 08:47:28.608430917 +0000 UTC m=+150.596442389" lastFinishedPulling="2026-02-19 08:47:59.474463722 +0000 UTC m=+181.462475194" observedRunningTime="2026-02-19 08:48:00.228188022 +0000 UTC m=+182.216199514" watchObservedRunningTime="2026-02-19 08:48:00.230661452 +0000 UTC m=+182.218672924" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.255794 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bjnxw" podStartSLOduration=2.474707959 podStartE2EDuration="30.255778961s" podCreationTimestamp="2026-02-19 08:47:30 +0000 UTC" firstStartedPulling="2026-02-19 08:47:31.798832979 +0000 UTC m=+153.786844451" lastFinishedPulling="2026-02-19 08:47:59.579903981 +0000 UTC m=+181.567915453" observedRunningTime="2026-02-19 08:48:00.252366714 +0000 UTC m=+182.240378216" watchObservedRunningTime="2026-02-19 08:48:00.255778961 +0000 UTC m=+182.243790433" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.489971 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.523479 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.523529 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.580644 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91ae2845-8dab-4371-a556-b0c758f85110-kubelet-dir\") pod \"91ae2845-8dab-4371-a556-b0c758f85110\" (UID: \"91ae2845-8dab-4371-a556-b0c758f85110\") " Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.580753 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ae2845-8dab-4371-a556-b0c758f85110-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "91ae2845-8dab-4371-a556-b0c758f85110" (UID: "91ae2845-8dab-4371-a556-b0c758f85110"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.580808 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91ae2845-8dab-4371-a556-b0c758f85110-kube-api-access\") pod \"91ae2845-8dab-4371-a556-b0c758f85110\" (UID: \"91ae2845-8dab-4371-a556-b0c758f85110\") " Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.581005 4788 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/91ae2845-8dab-4371-a556-b0c758f85110-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.593980 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91ae2845-8dab-4371-a556-b0c758f85110-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "91ae2845-8dab-4371-a556-b0c758f85110" (UID: "91ae2845-8dab-4371-a556-b0c758f85110"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.681693 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/91ae2845-8dab-4371-a556-b0c758f85110-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.784773 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-j6k7l" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.961765 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:48:00 crc kubenswrapper[4788]: I0219 08:48:00.961808 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:48:01 crc kubenswrapper[4788]: I0219 08:48:01.142370 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 19 08:48:01 crc kubenswrapper[4788]: I0219 08:48:01.142424 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"91ae2845-8dab-4371-a556-b0c758f85110","Type":"ContainerDied","Data":"1636c72bc2ea250724354f22da2d7a5fe441dbf815e9b00d10739ccb7198f7be"} Feb 19 08:48:01 crc kubenswrapper[4788]: I0219 08:48:01.142475 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1636c72bc2ea250724354f22da2d7a5fe441dbf815e9b00d10739ccb7198f7be" Feb 19 08:48:01 crc kubenswrapper[4788]: I0219 08:48:01.694592 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fx2sh" podUID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerName="registry-server" probeResult="failure" output=< Feb 19 08:48:01 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 08:48:01 crc kubenswrapper[4788]: > Feb 19 08:48:02 crc kubenswrapper[4788]: I0219 08:48:02.001576 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bjnxw" podUID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerName="registry-server" probeResult="failure" output=< Feb 19 08:48:02 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 08:48:02 crc kubenswrapper[4788]: > Feb 19 08:48:06 crc kubenswrapper[4788]: I0219 08:48:06.788324 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 19 08:48:07 crc kubenswrapper[4788]: I0219 08:48:07.965099 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:48:07 crc kubenswrapper[4788]: I0219 08:48:07.965188 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:48:08 crc kubenswrapper[4788]: I0219 08:48:08.029139 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:48:08 crc kubenswrapper[4788]: I0219 08:48:08.227015 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:48:08 crc kubenswrapper[4788]: I0219 08:48:08.279819 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlmp7"] Feb 19 08:48:08 crc kubenswrapper[4788]: I0219 08:48:08.735459 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xw477"] Feb 19 08:48:09 crc kubenswrapper[4788]: I0219 08:48:09.539539 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:48:09 crc kubenswrapper[4788]: I0219 08:48:09.539949 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:48:09 crc kubenswrapper[4788]: I0219 08:48:09.589016 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:48:09 crc kubenswrapper[4788]: I0219 08:48:09.946118 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:48:09 crc kubenswrapper[4788]: I0219 08:48:09.946168 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:48:09 crc kubenswrapper[4788]: I0219 08:48:09.994930 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.192099 4788 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dlmp7" podUID="218f4f15-8940-4e61-b94f-9967289b9846" containerName="registry-server" containerID="cri-o://c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3" gracePeriod=2 Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.228271 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s2fqp" Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.230765 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:48:10 crc kubenswrapper[4788]: E0219 08:48:10.363903 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod218f4f15_8940_4e61_b94f_9967289b9846.slice/crio-conmon-c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3.scope\": RecentStats: unable to find data in memory cache]" Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.578160 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.618366 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.621328 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fx2sh" Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.659391 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhkp5"] Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.701542 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-catalog-content\") pod \"218f4f15-8940-4e61-b94f-9967289b9846\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.701611 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddld7\" (UniqueName: \"kubernetes.io/projected/218f4f15-8940-4e61-b94f-9967289b9846-kube-api-access-ddld7\") pod \"218f4f15-8940-4e61-b94f-9967289b9846\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.701759 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-utilities\") pod \"218f4f15-8940-4e61-b94f-9967289b9846\" (UID: \"218f4f15-8940-4e61-b94f-9967289b9846\") " Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.703102 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-utilities" (OuterVolumeSpecName: "utilities") pod "218f4f15-8940-4e61-b94f-9967289b9846" (UID: "218f4f15-8940-4e61-b94f-9967289b9846"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.708476 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218f4f15-8940-4e61-b94f-9967289b9846-kube-api-access-ddld7" (OuterVolumeSpecName: "kube-api-access-ddld7") pod "218f4f15-8940-4e61-b94f-9967289b9846" (UID: "218f4f15-8940-4e61-b94f-9967289b9846"). InnerVolumeSpecName "kube-api-access-ddld7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.770506 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "218f4f15-8940-4e61-b94f-9967289b9846" (UID: "218f4f15-8940-4e61-b94f-9967289b9846"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.803705 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.803747 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/218f4f15-8940-4e61-b94f-9967289b9846-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:10 crc kubenswrapper[4788]: I0219 08:48:10.803760 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddld7\" (UniqueName: \"kubernetes.io/projected/218f4f15-8940-4e61-b94f-9967289b9846-kube-api-access-ddld7\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.005711 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:48:11 crc 
kubenswrapper[4788]: I0219 08:48:11.056724 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.199979 4788 generic.go:334] "Generic (PLEG): container finished" podID="218f4f15-8940-4e61-b94f-9967289b9846" containerID="c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3" exitCode=0 Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.200042 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlmp7" event={"ID":"218f4f15-8940-4e61-b94f-9967289b9846","Type":"ContainerDied","Data":"c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3"} Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.200075 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlmp7" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.200103 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlmp7" event={"ID":"218f4f15-8940-4e61-b94f-9967289b9846","Type":"ContainerDied","Data":"7fcc40e95e4424faaa199789e7e493dee097b4200e5844861960f4496fe99995"} Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.200126 4788 scope.go:117] "RemoveContainer" containerID="c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.237800 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlmp7"] Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.240322 4788 scope.go:117] "RemoveContainer" containerID="5e989d709a3c761f599b4d33510685268c6f27534665dfa6f8ac47e77cf03512" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.241937 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dlmp7"] Feb 19 08:48:11 crc 
kubenswrapper[4788]: I0219 08:48:11.268474 4788 scope.go:117] "RemoveContainer" containerID="af127b10f7376c06bcc60e197cda03aa154ac3f0ad10b61a589184d47fbdb8d8" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.298089 4788 scope.go:117] "RemoveContainer" containerID="c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3" Feb 19 08:48:11 crc kubenswrapper[4788]: E0219 08:48:11.300348 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3\": container with ID starting with c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3 not found: ID does not exist" containerID="c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.300393 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3"} err="failed to get container status \"c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3\": rpc error: code = NotFound desc = could not find container \"c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3\": container with ID starting with c5efdec646a38395f3fad723e9816c6d3f6800d5b072fdfa41df17d0e6651ac3 not found: ID does not exist" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.300440 4788 scope.go:117] "RemoveContainer" containerID="5e989d709a3c761f599b4d33510685268c6f27534665dfa6f8ac47e77cf03512" Feb 19 08:48:11 crc kubenswrapper[4788]: E0219 08:48:11.300927 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e989d709a3c761f599b4d33510685268c6f27534665dfa6f8ac47e77cf03512\": container with ID starting with 5e989d709a3c761f599b4d33510685268c6f27534665dfa6f8ac47e77cf03512 not found: ID does not exist" 
containerID="5e989d709a3c761f599b4d33510685268c6f27534665dfa6f8ac47e77cf03512" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.300961 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e989d709a3c761f599b4d33510685268c6f27534665dfa6f8ac47e77cf03512"} err="failed to get container status \"5e989d709a3c761f599b4d33510685268c6f27534665dfa6f8ac47e77cf03512\": rpc error: code = NotFound desc = could not find container \"5e989d709a3c761f599b4d33510685268c6f27534665dfa6f8ac47e77cf03512\": container with ID starting with 5e989d709a3c761f599b4d33510685268c6f27534665dfa6f8ac47e77cf03512 not found: ID does not exist" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.300985 4788 scope.go:117] "RemoveContainer" containerID="af127b10f7376c06bcc60e197cda03aa154ac3f0ad10b61a589184d47fbdb8d8" Feb 19 08:48:11 crc kubenswrapper[4788]: E0219 08:48:11.302190 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af127b10f7376c06bcc60e197cda03aa154ac3f0ad10b61a589184d47fbdb8d8\": container with ID starting with af127b10f7376c06bcc60e197cda03aa154ac3f0ad10b61a589184d47fbdb8d8 not found: ID does not exist" containerID="af127b10f7376c06bcc60e197cda03aa154ac3f0ad10b61a589184d47fbdb8d8" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.302255 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af127b10f7376c06bcc60e197cda03aa154ac3f0ad10b61a589184d47fbdb8d8"} err="failed to get container status \"af127b10f7376c06bcc60e197cda03aa154ac3f0ad10b61a589184d47fbdb8d8\": rpc error: code = NotFound desc = could not find container \"af127b10f7376c06bcc60e197cda03aa154ac3f0ad10b61a589184d47fbdb8d8\": container with ID starting with af127b10f7376c06bcc60e197cda03aa154ac3f0ad10b61a589184d47fbdb8d8 not found: ID does not exist" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.600697 4788 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 08:48:11 crc kubenswrapper[4788]: E0219 08:48:11.600901 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218f4f15-8940-4e61-b94f-9967289b9846" containerName="extract-utilities" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.600913 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="218f4f15-8940-4e61-b94f-9967289b9846" containerName="extract-utilities" Feb 19 08:48:11 crc kubenswrapper[4788]: E0219 08:48:11.600921 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2171a56-7a4c-4bde-8d04-a1510d4e4ed6" containerName="pruner" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.600927 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2171a56-7a4c-4bde-8d04-a1510d4e4ed6" containerName="pruner" Feb 19 08:48:11 crc kubenswrapper[4788]: E0219 08:48:11.600939 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218f4f15-8940-4e61-b94f-9967289b9846" containerName="extract-content" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.600946 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="218f4f15-8940-4e61-b94f-9967289b9846" containerName="extract-content" Feb 19 08:48:11 crc kubenswrapper[4788]: E0219 08:48:11.600956 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218f4f15-8940-4e61-b94f-9967289b9846" containerName="registry-server" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.600962 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="218f4f15-8940-4e61-b94f-9967289b9846" containerName="registry-server" Feb 19 08:48:11 crc kubenswrapper[4788]: E0219 08:48:11.600972 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ae2845-8dab-4371-a556-b0c758f85110" containerName="pruner" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.600977 4788 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="91ae2845-8dab-4371-a556-b0c758f85110" containerName="pruner" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.601057 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2171a56-7a4c-4bde-8d04-a1510d4e4ed6" containerName="pruner" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.601070 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ae2845-8dab-4371-a556-b0c758f85110" containerName="pruner" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.601076 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="218f4f15-8940-4e61-b94f-9967289b9846" containerName="registry-server" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.601392 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.603026 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.603487 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.611818 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed47f512-b2b5-479e-9508-aaf66c6cb137-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ed47f512-b2b5-479e-9508-aaf66c6cb137\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.611852 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed47f512-b2b5-479e-9508-aaf66c6cb137-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ed47f512-b2b5-479e-9508-aaf66c6cb137\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.615570 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.712917 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed47f512-b2b5-479e-9508-aaf66c6cb137-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ed47f512-b2b5-479e-9508-aaf66c6cb137\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.712981 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed47f512-b2b5-479e-9508-aaf66c6cb137-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ed47f512-b2b5-479e-9508-aaf66c6cb137\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.713065 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed47f512-b2b5-479e-9508-aaf66c6cb137-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ed47f512-b2b5-479e-9508-aaf66c6cb137\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.732016 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed47f512-b2b5-479e-9508-aaf66c6cb137-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ed47f512-b2b5-479e-9508-aaf66c6cb137\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 08:48:11 crc kubenswrapper[4788]: I0219 08:48:11.922484 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 08:48:12 crc kubenswrapper[4788]: I0219 08:48:12.208496 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fhkp5" podUID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" containerName="registry-server" containerID="cri-o://77d2c02686781e1a8f98a4d0e200cc3616985a44a131fa25cf883794927f274f" gracePeriod=2 Feb 19 08:48:12 crc kubenswrapper[4788]: I0219 08:48:12.308774 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 19 08:48:12 crc kubenswrapper[4788]: W0219 08:48:12.316372 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poded47f512_b2b5_479e_9508_aaf66c6cb137.slice/crio-a6dc4d0c14d6ebc8f68c64bc50ab0b25379f8d2878c4de547e88d0e19c6f51c5 WatchSource:0}: Error finding container a6dc4d0c14d6ebc8f68c64bc50ab0b25379f8d2878c4de547e88d0e19c6f51c5: Status 404 returned error can't find the container with id a6dc4d0c14d6ebc8f68c64bc50ab0b25379f8d2878c4de547e88d0e19c6f51c5 Feb 19 08:48:12 crc kubenswrapper[4788]: I0219 08:48:12.723161 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218f4f15-8940-4e61-b94f-9967289b9846" path="/var/lib/kubelet/pods/218f4f15-8940-4e61-b94f-9967289b9846/volumes" Feb 19 08:48:12 crc kubenswrapper[4788]: I0219 08:48:12.855741 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bjnxw"] Feb 19 08:48:12 crc kubenswrapper[4788]: I0219 08:48:12.856011 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bjnxw" podUID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerName="registry-server" containerID="cri-o://cf3f9147848bcc3843c38d1c06161708f9b48700981e46db894060812e72d373" gracePeriod=2 Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.241587 4788 generic.go:334] 
"Generic (PLEG): container finished" podID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" containerID="77d2c02686781e1a8f98a4d0e200cc3616985a44a131fa25cf883794927f274f" exitCode=0 Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.241645 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhkp5" event={"ID":"4ef38e88-8e5c-4b56-8123-a60a3eded0a7","Type":"ContainerDied","Data":"77d2c02686781e1a8f98a4d0e200cc3616985a44a131fa25cf883794927f274f"} Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.243917 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ed47f512-b2b5-479e-9508-aaf66c6cb137","Type":"ContainerStarted","Data":"046187f3c0f60a416ec1e30834a5d8459bd7d2e7f9d7756c48ec7444848b66f1"} Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.244128 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ed47f512-b2b5-479e-9508-aaf66c6cb137","Type":"ContainerStarted","Data":"a6dc4d0c14d6ebc8f68c64bc50ab0b25379f8d2878c4de547e88d0e19c6f51c5"} Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.273721 4788 generic.go:334] "Generic (PLEG): container finished" podID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerID="cf3f9147848bcc3843c38d1c06161708f9b48700981e46db894060812e72d373" exitCode=0 Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.273768 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjnxw" event={"ID":"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6","Type":"ContainerDied","Data":"cf3f9147848bcc3843c38d1c06161708f9b48700981e46db894060812e72d373"} Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.274596 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.274573143 podStartE2EDuration="2.274573143s" 
podCreationTimestamp="2026-02-19 08:48:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:48:13.258413771 +0000 UTC m=+195.246425243" watchObservedRunningTime="2026-02-19 08:48:13.274573143 +0000 UTC m=+195.262584615" Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.774010 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.844875 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-utilities\") pod \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.844990 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-catalog-content\") pod \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.845046 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v7kn\" (UniqueName: \"kubernetes.io/projected/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-kube-api-access-2v7kn\") pod \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\" (UID: \"4ef38e88-8e5c-4b56-8123-a60a3eded0a7\") " Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.847425 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-utilities" (OuterVolumeSpecName: "utilities") pod "4ef38e88-8e5c-4b56-8123-a60a3eded0a7" (UID: "4ef38e88-8e5c-4b56-8123-a60a3eded0a7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.856775 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-kube-api-access-2v7kn" (OuterVolumeSpecName: "kube-api-access-2v7kn") pod "4ef38e88-8e5c-4b56-8123-a60a3eded0a7" (UID: "4ef38e88-8e5c-4b56-8123-a60a3eded0a7"). InnerVolumeSpecName "kube-api-access-2v7kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.878396 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ef38e88-8e5c-4b56-8123-a60a3eded0a7" (UID: "4ef38e88-8e5c-4b56-8123-a60a3eded0a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.946686 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.946730 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v7kn\" (UniqueName: \"kubernetes.io/projected/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-kube-api-access-2v7kn\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:13 crc kubenswrapper[4788]: I0219 08:48:13.946745 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ef38e88-8e5c-4b56-8123-a60a3eded0a7-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.074963 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.147677 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-utilities\") pod \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\" (UID: \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.147732 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-catalog-content\") pod \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\" (UID: \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.147772 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gspsg\" (UniqueName: \"kubernetes.io/projected/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-kube-api-access-gspsg\") pod \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\" (UID: \"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6\") " Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.148525 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-utilities" (OuterVolumeSpecName: "utilities") pod "c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" (UID: "c056bee7-9698-4107-b2e8-dd9c2b3eb6a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.152383 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-kube-api-access-gspsg" (OuterVolumeSpecName: "kube-api-access-gspsg") pod "c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" (UID: "c056bee7-9698-4107-b2e8-dd9c2b3eb6a6"). InnerVolumeSpecName "kube-api-access-gspsg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.248674 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.248710 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gspsg\" (UniqueName: \"kubernetes.io/projected/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-kube-api-access-gspsg\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.285547 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bjnxw" event={"ID":"c056bee7-9698-4107-b2e8-dd9c2b3eb6a6","Type":"ContainerDied","Data":"64358431445cdffd7cdf19876997ec4315aeef97b97afb98256d094b58c12287"} Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.285603 4788 scope.go:117] "RemoveContainer" containerID="cf3f9147848bcc3843c38d1c06161708f9b48700981e46db894060812e72d373" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.285560 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bjnxw" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.291037 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhkp5" event={"ID":"4ef38e88-8e5c-4b56-8123-a60a3eded0a7","Type":"ContainerDied","Data":"af99c4e7cb72bb6a0cf8a83d5cde041d5ddf987bfaae703840c4c340d427c286"} Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.291120 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhkp5" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.299915 4788 generic.go:334] "Generic (PLEG): container finished" podID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" containerID="c66611af0eab474f3628d92cec7f40c9296bcfc830409e442f3ede5c31b4d7fa" exitCode=0 Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.299974 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmfkx" event={"ID":"c3846ca6-3c9c-4f02-978e-bee6148e0ba7","Type":"ContainerDied","Data":"c66611af0eab474f3628d92cec7f40c9296bcfc830409e442f3ede5c31b4d7fa"} Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.299985 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" (UID: "c056bee7-9698-4107-b2e8-dd9c2b3eb6a6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.304839 4788 generic.go:334] "Generic (PLEG): container finished" podID="ed47f512-b2b5-479e-9508-aaf66c6cb137" containerID="046187f3c0f60a416ec1e30834a5d8459bd7d2e7f9d7756c48ec7444848b66f1" exitCode=0 Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.304876 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ed47f512-b2b5-479e-9508-aaf66c6cb137","Type":"ContainerDied","Data":"046187f3c0f60a416ec1e30834a5d8459bd7d2e7f9d7756c48ec7444848b66f1"} Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.322460 4788 scope.go:117] "RemoveContainer" containerID="4f673db51164b818979244b252175bcb0e084f76410a5a27cc116b129c035463" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.336934 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhkp5"] Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.342079 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhkp5"] Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.352337 4788 scope.go:117] "RemoveContainer" containerID="1ffe3a55100beeac8f126fdabb7d81205c0e293a125f022ff70a37ce71477e0a" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.352827 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.369478 4788 scope.go:117] "RemoveContainer" containerID="77d2c02686781e1a8f98a4d0e200cc3616985a44a131fa25cf883794927f274f" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.387178 4788 scope.go:117] "RemoveContainer" containerID="25fb1865e57dbb0c181ffdf77098a9b436c21747eca32848258acb17609045ac" Feb 19 08:48:14 crc 
kubenswrapper[4788]: I0219 08:48:14.414618 4788 scope.go:117] "RemoveContainer" containerID="5594757e59dde3811fa1cd268c9064df5e239699aaca31565a236c8175af6b5e" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.610735 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bjnxw"] Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.613488 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bjnxw"] Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.740831 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" path="/var/lib/kubelet/pods/4ef38e88-8e5c-4b56-8123-a60a3eded0a7/volumes" Feb 19 08:48:14 crc kubenswrapper[4788]: I0219 08:48:14.741432 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" path="/var/lib/kubelet/pods/c056bee7-9698-4107-b2e8-dd9c2b3eb6a6/volumes" Feb 19 08:48:15 crc kubenswrapper[4788]: I0219 08:48:15.315982 4788 generic.go:334] "Generic (PLEG): container finished" podID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" containerID="d397b44d67cf5a6edc9fcfa218ab6faf30b8845855e7793a8e16358234ab6c88" exitCode=0 Feb 19 08:48:15 crc kubenswrapper[4788]: I0219 08:48:15.316051 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swpvd" event={"ID":"f727c8c6-b0d5-470e-bd9a-593b908dbef4","Type":"ContainerDied","Data":"d397b44d67cf5a6edc9fcfa218ab6faf30b8845855e7793a8e16358234ab6c88"} Feb 19 08:48:15 crc kubenswrapper[4788]: I0219 08:48:15.318981 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmfkx" event={"ID":"c3846ca6-3c9c-4f02-978e-bee6148e0ba7","Type":"ContainerStarted","Data":"eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa"} Feb 19 08:48:15 crc kubenswrapper[4788]: I0219 08:48:15.369059 4788 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wmfkx" podStartSLOduration=3.11990563 podStartE2EDuration="49.369039849s" podCreationTimestamp="2026-02-19 08:47:26 +0000 UTC" firstStartedPulling="2026-02-19 08:47:28.560076762 +0000 UTC m=+150.548088234" lastFinishedPulling="2026-02-19 08:48:14.809210981 +0000 UTC m=+196.797222453" observedRunningTime="2026-02-19 08:48:15.366237069 +0000 UTC m=+197.354248571" watchObservedRunningTime="2026-02-19 08:48:15.369039849 +0000 UTC m=+197.357051341" Feb 19 08:48:15 crc kubenswrapper[4788]: I0219 08:48:15.610050 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 08:48:15 crc kubenswrapper[4788]: I0219 08:48:15.667707 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed47f512-b2b5-479e-9508-aaf66c6cb137-kube-api-access\") pod \"ed47f512-b2b5-479e-9508-aaf66c6cb137\" (UID: \"ed47f512-b2b5-479e-9508-aaf66c6cb137\") " Feb 19 08:48:15 crc kubenswrapper[4788]: I0219 08:48:15.667827 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed47f512-b2b5-479e-9508-aaf66c6cb137-kubelet-dir\") pod \"ed47f512-b2b5-479e-9508-aaf66c6cb137\" (UID: \"ed47f512-b2b5-479e-9508-aaf66c6cb137\") " Feb 19 08:48:15 crc kubenswrapper[4788]: I0219 08:48:15.668046 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ed47f512-b2b5-479e-9508-aaf66c6cb137-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ed47f512-b2b5-479e-9508-aaf66c6cb137" (UID: "ed47f512-b2b5-479e-9508-aaf66c6cb137"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 08:48:15 crc kubenswrapper[4788]: I0219 08:48:15.672891 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed47f512-b2b5-479e-9508-aaf66c6cb137-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ed47f512-b2b5-479e-9508-aaf66c6cb137" (UID: "ed47f512-b2b5-479e-9508-aaf66c6cb137"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:48:15 crc kubenswrapper[4788]: I0219 08:48:15.768944 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed47f512-b2b5-479e-9508-aaf66c6cb137-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:15 crc kubenswrapper[4788]: I0219 08:48:15.768985 4788 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed47f512-b2b5-479e-9508-aaf66c6cb137-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:16 crc kubenswrapper[4788]: I0219 08:48:16.325590 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ed47f512-b2b5-479e-9508-aaf66c6cb137","Type":"ContainerDied","Data":"a6dc4d0c14d6ebc8f68c64bc50ab0b25379f8d2878c4de547e88d0e19c6f51c5"} Feb 19 08:48:16 crc kubenswrapper[4788]: I0219 08:48:16.325622 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6dc4d0c14d6ebc8f68c64bc50ab0b25379f8d2878c4de547e88d0e19c6f51c5" Feb 19 08:48:16 crc kubenswrapper[4788]: I0219 08:48:16.325668 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 19 08:48:17 crc kubenswrapper[4788]: I0219 08:48:17.336083 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:48:17 crc kubenswrapper[4788]: I0219 08:48:17.336502 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:48:17 crc kubenswrapper[4788]: I0219 08:48:17.373099 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.000418 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 08:48:18 crc kubenswrapper[4788]: E0219 08:48:18.000652 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" containerName="extract-utilities" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.000667 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" containerName="extract-utilities" Feb 19 08:48:18 crc kubenswrapper[4788]: E0219 08:48:18.000682 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed47f512-b2b5-479e-9508-aaf66c6cb137" containerName="pruner" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.000690 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed47f512-b2b5-479e-9508-aaf66c6cb137" containerName="pruner" Feb 19 08:48:18 crc kubenswrapper[4788]: E0219 08:48:18.000706 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" containerName="extract-content" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.000714 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" containerName="extract-content" Feb 19 
08:48:18 crc kubenswrapper[4788]: E0219 08:48:18.000728 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerName="extract-content" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.000737 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerName="extract-content" Feb 19 08:48:18 crc kubenswrapper[4788]: E0219 08:48:18.000746 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerName="registry-server" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.000754 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerName="registry-server" Feb 19 08:48:18 crc kubenswrapper[4788]: E0219 08:48:18.000767 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" containerName="registry-server" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.000775 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" containerName="registry-server" Feb 19 08:48:18 crc kubenswrapper[4788]: E0219 08:48:18.000791 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerName="extract-utilities" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.000801 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerName="extract-utilities" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.000933 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="c056bee7-9698-4107-b2e8-dd9c2b3eb6a6" containerName="registry-server" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.000949 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef38e88-8e5c-4b56-8123-a60a3eded0a7" containerName="registry-server" Feb 19 
08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.000961 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed47f512-b2b5-479e-9508-aaf66c6cb137" containerName="pruner" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.001437 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.004434 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.004635 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.013820 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.099414 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f6604880-4b64-4653-bfa0-f2e6448d801f\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.099639 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6604880-4b64-4653-bfa0-f2e6448d801f-kube-api-access\") pod \"installer-9-crc\" (UID: \"f6604880-4b64-4653-bfa0-f2e6448d801f\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.099713 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-var-lock\") pod \"installer-9-crc\" (UID: 
\"f6604880-4b64-4653-bfa0-f2e6448d801f\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.200979 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6604880-4b64-4653-bfa0-f2e6448d801f-kube-api-access\") pod \"installer-9-crc\" (UID: \"f6604880-4b64-4653-bfa0-f2e6448d801f\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.201049 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-var-lock\") pod \"installer-9-crc\" (UID: \"f6604880-4b64-4653-bfa0-f2e6448d801f\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.201094 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f6604880-4b64-4653-bfa0-f2e6448d801f\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.201184 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f6604880-4b64-4653-bfa0-f2e6448d801f\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.201195 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-var-lock\") pod \"installer-9-crc\" (UID: \"f6604880-4b64-4653-bfa0-f2e6448d801f\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.218504 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6604880-4b64-4653-bfa0-f2e6448d801f-kube-api-access\") pod \"installer-9-crc\" (UID: \"f6604880-4b64-4653-bfa0-f2e6448d801f\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.317476 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.336898 4788 generic.go:334] "Generic (PLEG): container finished" podID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" containerID="89469a697ab26765ba2b62f3c4714d0794712602d5f341622dacfec8fc6a35e4" exitCode=0 Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.336971 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5blbm" event={"ID":"31ee37e0-613d-48b1-8e38-e8cc4608ee14","Type":"ContainerDied","Data":"89469a697ab26765ba2b62f3c4714d0794712602d5f341622dacfec8fc6a35e4"} Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.343969 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swpvd" event={"ID":"f727c8c6-b0d5-470e-bd9a-593b908dbef4","Type":"ContainerStarted","Data":"a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52"} Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.387828 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-swpvd" podStartSLOduration=2.7310410579999997 podStartE2EDuration="51.387811167s" podCreationTimestamp="2026-02-19 08:47:27 +0000 UTC" firstStartedPulling="2026-02-19 08:47:28.534031017 +0000 UTC m=+150.522042489" lastFinishedPulling="2026-02-19 08:48:17.190801126 +0000 UTC m=+199.178812598" observedRunningTime="2026-02-19 08:48:18.38582212 +0000 UTC m=+200.373833582" watchObservedRunningTime="2026-02-19 
08:48:18.387811167 +0000 UTC m=+200.375822639" Feb 19 08:48:18 crc kubenswrapper[4788]: I0219 08:48:18.790468 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 19 08:48:19 crc kubenswrapper[4788]: I0219 08:48:19.351971 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f6604880-4b64-4653-bfa0-f2e6448d801f","Type":"ContainerStarted","Data":"daeff5521ddd17371de0c638c6aad5cc66a27cb3e43d5ec6ffd6949940bd4039"} Feb 19 08:48:19 crc kubenswrapper[4788]: I0219 08:48:19.352024 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f6604880-4b64-4653-bfa0-f2e6448d801f","Type":"ContainerStarted","Data":"7746ec2090aa46f02de0e079d213b481d9402bd64e4db3bbc59733d48bd07310"} Feb 19 08:48:19 crc kubenswrapper[4788]: I0219 08:48:19.355637 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5blbm" event={"ID":"31ee37e0-613d-48b1-8e38-e8cc4608ee14","Type":"ContainerStarted","Data":"25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a"} Feb 19 08:48:19 crc kubenswrapper[4788]: I0219 08:48:19.374991 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.37496351 podStartE2EDuration="2.37496351s" podCreationTimestamp="2026-02-19 08:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:48:19.371011907 +0000 UTC m=+201.359023379" watchObservedRunningTime="2026-02-19 08:48:19.37496351 +0000 UTC m=+201.362975022" Feb 19 08:48:19 crc kubenswrapper[4788]: I0219 08:48:19.390937 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5blbm" podStartSLOduration=2.134689094 podStartE2EDuration="52.390909117s" 
podCreationTimestamp="2026-02-19 08:47:27 +0000 UTC" firstStartedPulling="2026-02-19 08:47:28.612073601 +0000 UTC m=+150.600085073" lastFinishedPulling="2026-02-19 08:48:18.868293624 +0000 UTC m=+200.856305096" observedRunningTime="2026-02-19 08:48:19.389334972 +0000 UTC m=+201.377346454" watchObservedRunningTime="2026-02-19 08:48:19.390909117 +0000 UTC m=+201.378920639" Feb 19 08:48:22 crc kubenswrapper[4788]: I0219 08:48:22.139724 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 08:48:22 crc kubenswrapper[4788]: I0219 08:48:22.139814 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 08:48:22 crc kubenswrapper[4788]: I0219 08:48:22.139881 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:48:22 crc kubenswrapper[4788]: I0219 08:48:22.140825 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 08:48:22 crc kubenswrapper[4788]: I0219 08:48:22.140922 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" 
podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96" gracePeriod=600 Feb 19 08:48:22 crc kubenswrapper[4788]: I0219 08:48:22.376874 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96" exitCode=0 Feb 19 08:48:22 crc kubenswrapper[4788]: I0219 08:48:22.376936 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96"} Feb 19 08:48:23 crc kubenswrapper[4788]: I0219 08:48:23.387448 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"487f4bc93be66363f5fd6689b686bc27a3cdc3fd662d9f911cdcd094586d7699"} Feb 19 08:48:27 crc kubenswrapper[4788]: I0219 08:48:27.395567 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wmfkx" Feb 19 08:48:27 crc kubenswrapper[4788]: I0219 08:48:27.594926 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:48:27 crc kubenswrapper[4788]: I0219 08:48:27.595000 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:48:27 crc kubenswrapper[4788]: I0219 08:48:27.645172 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:48:27 crc kubenswrapper[4788]: I0219 08:48:27.723300 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:48:27 crc kubenswrapper[4788]: I0219 08:48:27.723436 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:48:27 crc kubenswrapper[4788]: I0219 08:48:27.792574 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:48:28 crc kubenswrapper[4788]: I0219 08:48:28.501948 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-swpvd" Feb 19 08:48:28 crc kubenswrapper[4788]: I0219 08:48:28.503005 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:48:29 crc kubenswrapper[4788]: I0219 08:48:29.405821 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5blbm"] Feb 19 08:48:30 crc kubenswrapper[4788]: I0219 08:48:30.442475 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5blbm" podUID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" containerName="registry-server" containerID="cri-o://25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a" gracePeriod=2 Feb 19 08:48:30 crc kubenswrapper[4788]: I0219 08:48:30.978240 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.070793 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-utilities\") pod \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.070911 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wbls\" (UniqueName: \"kubernetes.io/projected/31ee37e0-613d-48b1-8e38-e8cc4608ee14-kube-api-access-4wbls\") pod \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.070952 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-catalog-content\") pod \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\" (UID: \"31ee37e0-613d-48b1-8e38-e8cc4608ee14\") " Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.071670 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-utilities" (OuterVolumeSpecName: "utilities") pod "31ee37e0-613d-48b1-8e38-e8cc4608ee14" (UID: "31ee37e0-613d-48b1-8e38-e8cc4608ee14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.076561 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31ee37e0-613d-48b1-8e38-e8cc4608ee14-kube-api-access-4wbls" (OuterVolumeSpecName: "kube-api-access-4wbls") pod "31ee37e0-613d-48b1-8e38-e8cc4608ee14" (UID: "31ee37e0-613d-48b1-8e38-e8cc4608ee14"). InnerVolumeSpecName "kube-api-access-4wbls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.152445 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31ee37e0-613d-48b1-8e38-e8cc4608ee14" (UID: "31ee37e0-613d-48b1-8e38-e8cc4608ee14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.172764 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.172797 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wbls\" (UniqueName: \"kubernetes.io/projected/31ee37e0-613d-48b1-8e38-e8cc4608ee14-kube-api-access-4wbls\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.172816 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ee37e0-613d-48b1-8e38-e8cc4608ee14-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.451345 4788 generic.go:334] "Generic (PLEG): container finished" podID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" containerID="25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a" exitCode=0 Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.451415 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5blbm" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.451427 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5blbm" event={"ID":"31ee37e0-613d-48b1-8e38-e8cc4608ee14","Type":"ContainerDied","Data":"25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a"} Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.451490 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5blbm" event={"ID":"31ee37e0-613d-48b1-8e38-e8cc4608ee14","Type":"ContainerDied","Data":"14b91294edc4866aaf09a102b0f887affbc1bc41f7f48726dfe45e54bb5b1116"} Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.451533 4788 scope.go:117] "RemoveContainer" containerID="25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.487234 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5blbm"] Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.487561 4788 scope.go:117] "RemoveContainer" containerID="89469a697ab26765ba2b62f3c4714d0794712602d5f341622dacfec8fc6a35e4" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.492991 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5blbm"] Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.512549 4788 scope.go:117] "RemoveContainer" containerID="f6b5079491e184c6892bb424c51766b51261bfa402e685ddd8cceac43d7f2df7" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.534751 4788 scope.go:117] "RemoveContainer" containerID="25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a" Feb 19 08:48:31 crc kubenswrapper[4788]: E0219 08:48:31.535316 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a\": container with ID starting with 25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a not found: ID does not exist" containerID="25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.535382 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a"} err="failed to get container status \"25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a\": rpc error: code = NotFound desc = could not find container \"25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a\": container with ID starting with 25a689ab43dc585687849351c9378fd61795788463e960d3df40a4cafab2c36a not found: ID does not exist" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.535417 4788 scope.go:117] "RemoveContainer" containerID="89469a697ab26765ba2b62f3c4714d0794712602d5f341622dacfec8fc6a35e4" Feb 19 08:48:31 crc kubenswrapper[4788]: E0219 08:48:31.535906 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89469a697ab26765ba2b62f3c4714d0794712602d5f341622dacfec8fc6a35e4\": container with ID starting with 89469a697ab26765ba2b62f3c4714d0794712602d5f341622dacfec8fc6a35e4 not found: ID does not exist" containerID="89469a697ab26765ba2b62f3c4714d0794712602d5f341622dacfec8fc6a35e4" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.535946 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89469a697ab26765ba2b62f3c4714d0794712602d5f341622dacfec8fc6a35e4"} err="failed to get container status \"89469a697ab26765ba2b62f3c4714d0794712602d5f341622dacfec8fc6a35e4\": rpc error: code = NotFound desc = could not find container \"89469a697ab26765ba2b62f3c4714d0794712602d5f341622dacfec8fc6a35e4\": container with ID 
starting with 89469a697ab26765ba2b62f3c4714d0794712602d5f341622dacfec8fc6a35e4 not found: ID does not exist" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.535972 4788 scope.go:117] "RemoveContainer" containerID="f6b5079491e184c6892bb424c51766b51261bfa402e685ddd8cceac43d7f2df7" Feb 19 08:48:31 crc kubenswrapper[4788]: E0219 08:48:31.536387 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b5079491e184c6892bb424c51766b51261bfa402e685ddd8cceac43d7f2df7\": container with ID starting with f6b5079491e184c6892bb424c51766b51261bfa402e685ddd8cceac43d7f2df7 not found: ID does not exist" containerID="f6b5079491e184c6892bb424c51766b51261bfa402e685ddd8cceac43d7f2df7" Feb 19 08:48:31 crc kubenswrapper[4788]: I0219 08:48:31.536486 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b5079491e184c6892bb424c51766b51261bfa402e685ddd8cceac43d7f2df7"} err="failed to get container status \"f6b5079491e184c6892bb424c51766b51261bfa402e685ddd8cceac43d7f2df7\": rpc error: code = NotFound desc = could not find container \"f6b5079491e184c6892bb424c51766b51261bfa402e685ddd8cceac43d7f2df7\": container with ID starting with f6b5079491e184c6892bb424c51766b51261bfa402e685ddd8cceac43d7f2df7 not found: ID does not exist" Feb 19 08:48:32 crc kubenswrapper[4788]: I0219 08:48:32.727618 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" path="/var/lib/kubelet/pods/31ee37e0-613d-48b1-8e38-e8cc4608ee14/volumes" Feb 19 08:48:33 crc kubenswrapper[4788]: I0219 08:48:33.775231 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xw477" podUID="49e1dd56-37f0-41b8-8afa-d040c5750fac" containerName="oauth-openshift" containerID="cri-o://156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad" gracePeriod=15 Feb 19 08:48:34 crc 
kubenswrapper[4788]: I0219 08:48:34.176295 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xw477" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.234640 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-login\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.234699 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-serving-cert\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.234731 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-error\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.234756 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-policies\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.235808 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" 
(UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236014 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-service-ca\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236065 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-ocp-branding-template\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236111 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-dir\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236143 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-session\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236188 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-router-certs\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: 
\"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236230 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-cliconfig\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236328 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-idp-0-file-data\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236391 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l5hg\" (UniqueName: \"kubernetes.io/projected/49e1dd56-37f0-41b8-8afa-d040c5750fac-kube-api-access-5l5hg\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236423 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-trusted-ca-bundle\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236471 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-provider-selection\") pod \"49e1dd56-37f0-41b8-8afa-d040c5750fac\" (UID: \"49e1dd56-37f0-41b8-8afa-d040c5750fac\") " Feb 19 
08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236478 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236849 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.236874 4788 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.237331 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.238001 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.238069 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.259078 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.263815 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.264218 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.265714 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e1dd56-37f0-41b8-8afa-d040c5750fac-kube-api-access-5l5hg" (OuterVolumeSpecName: "kube-api-access-5l5hg") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "kube-api-access-5l5hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.268619 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.270417 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.272873 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.273330 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.273473 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49e1dd56-37f0-41b8-8afa-d040c5750fac" (UID: "49e1dd56-37f0-41b8-8afa-d040c5750fac"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337626 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l5hg\" (UniqueName: \"kubernetes.io/projected/49e1dd56-37f0-41b8-8afa-d040c5750fac-kube-api-access-5l5hg\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337669 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337685 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337700 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337713 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337725 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337736 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337749 4788 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49e1dd56-37f0-41b8-8afa-d040c5750fac-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337761 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337773 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337785 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.337797 4788 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49e1dd56-37f0-41b8-8afa-d040c5750fac-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.475034 4788 generic.go:334] "Generic (PLEG): container finished" podID="49e1dd56-37f0-41b8-8afa-d040c5750fac" containerID="156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad" exitCode=0
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.475091 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xw477" event={"ID":"49e1dd56-37f0-41b8-8afa-d040c5750fac","Type":"ContainerDied","Data":"156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad"}
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.475129 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xw477" event={"ID":"49e1dd56-37f0-41b8-8afa-d040c5750fac","Type":"ContainerDied","Data":"578489b10acbd1a479d23c61f019c6f66049ebd3da497f7b5b2147eb21da71af"}
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.475157 4788 scope.go:117] "RemoveContainer" containerID="156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad"
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.475093 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xw477"
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.509417 4788 scope.go:117] "RemoveContainer" containerID="156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad"
Feb 19 08:48:34 crc kubenswrapper[4788]: E0219 08:48:34.510062 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad\": container with ID starting with 156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad not found: ID does not exist" containerID="156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad"
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.510124 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad"} err="failed to get container status \"156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad\": rpc error: code = NotFound desc = could not find container \"156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad\": container with ID starting with 156ccd00ad6e088f424589b8c70b823f327693239cc599c3a785d8bec6cb8cad not found: ID does not exist"
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.526488 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xw477"]
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.531767 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xw477"]
Feb 19 08:48:34 crc kubenswrapper[4788]: I0219 08:48:34.725947 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49e1dd56-37f0-41b8-8afa-d040c5750fac" path="/var/lib/kubelet/pods/49e1dd56-37f0-41b8-8afa-d040c5750fac/volumes"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.416544 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7448d7568b-tlcph"]
Feb 19 08:48:41 crc kubenswrapper[4788]: E0219 08:48:41.418940 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" containerName="extract-content"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.419435 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" containerName="extract-content"
Feb 19 08:48:41 crc kubenswrapper[4788]: E0219 08:48:41.419590 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" containerName="extract-utilities"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.419717 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" containerName="extract-utilities"
Feb 19 08:48:41 crc kubenswrapper[4788]: E0219 08:48:41.419860 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" containerName="registry-server"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.419986 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" containerName="registry-server"
Feb 19 08:48:41 crc kubenswrapper[4788]: E0219 08:48:41.420126 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e1dd56-37f0-41b8-8afa-d040c5750fac" containerName="oauth-openshift"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.420309 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e1dd56-37f0-41b8-8afa-d040c5750fac" containerName="oauth-openshift"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.420731 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="31ee37e0-613d-48b1-8e38-e8cc4608ee14" containerName="registry-server"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.420942 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e1dd56-37f0-41b8-8afa-d040c5750fac" containerName="oauth-openshift"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.421843 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.424996 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.425441 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.426555 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.430354 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.431027 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.431182 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.431836 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.432135 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.432334 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.432760 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.434387 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.435476 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.443402 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.449958 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.456909 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7448d7568b-tlcph"]
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.458495 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.529304 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.529354 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3241dde2-79f4-4b36-ae32-340844b391cb-audit-dir\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.529424 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.529550 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74tfp\" (UniqueName: \"kubernetes.io/projected/3241dde2-79f4-4b36-ae32-340844b391cb-kube-api-access-74tfp\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.529648 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-template-error\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.529705 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-template-login\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.529774 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-audit-policies\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.529897 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-service-ca\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.529955 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.530029 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.530072 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-router-certs\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.530132 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.530185 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.530353 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-session\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.631702 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.631783 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.631837 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-session\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.631953 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3241dde2-79f4-4b36-ae32-340844b391cb-audit-dir\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.631997 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.632063 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.632113 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74tfp\" (UniqueName: \"kubernetes.io/projected/3241dde2-79f4-4b36-ae32-340844b391cb-kube-api-access-74tfp\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.632165 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-template-login\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.632195 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-template-error\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.632226 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-audit-policies\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.632336 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-service-ca\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.632375 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.632416 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-router-certs\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.632449 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.632606 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3241dde2-79f4-4b36-ae32-340844b391cb-audit-dir\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.633680 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.633958 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-audit-policies\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.634380 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-service-ca\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.634964 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.639375 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.640296 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-template-login\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.640727 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-router-certs\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.640851 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.645670 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-session\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.648225 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.652936 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.653959 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3241dde2-79f4-4b36-ae32-340844b391cb-v4-0-config-user-template-error\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.660216 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74tfp\" (UniqueName: \"kubernetes.io/projected/3241dde2-79f4-4b36-ae32-340844b391cb-kube-api-access-74tfp\") pod \"oauth-openshift-7448d7568b-tlcph\" (UID: \"3241dde2-79f4-4b36-ae32-340844b391cb\") " pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:41 crc kubenswrapper[4788]: I0219 08:48:41.758912 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:42 crc kubenswrapper[4788]: I0219 08:48:42.048079 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7448d7568b-tlcph"]
Feb 19 08:48:42 crc kubenswrapper[4788]: I0219 08:48:42.528876 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph" event={"ID":"3241dde2-79f4-4b36-ae32-340844b391cb","Type":"ContainerStarted","Data":"f47edf7dbfe80372dd85f80232e475611c853a26d697111da58fadd53127ad09"}
Feb 19 08:48:42 crc kubenswrapper[4788]: I0219 08:48:42.528918 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph" event={"ID":"3241dde2-79f4-4b36-ae32-340844b391cb","Type":"ContainerStarted","Data":"2d537ad0531ac27d6056c4fdbc5a528d692fa1f49489b23e105a7c55f51be0bd"}
Feb 19 08:48:42 crc kubenswrapper[4788]: I0219 08:48:42.529300 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:42 crc kubenswrapper[4788]: I0219 08:48:42.572219 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph" podStartSLOduration=34.572193017000004 podStartE2EDuration="34.572193017s" podCreationTimestamp="2026-02-19 08:48:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:48:42.569649634 +0000 UTC m=+224.557661146" watchObservedRunningTime="2026-02-19 08:48:42.572193017 +0000 UTC m=+224.560204509"
Feb 19 08:48:42 crc kubenswrapper[4788]: I0219 08:48:42.794994 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7448d7568b-tlcph"
Feb 19 08:48:56 crc kubenswrapper[4788]: E0219 08:48:56.892443 4788 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file"
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.894223 4788 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.895357 4788 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.895491 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.895768 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171" gracePeriod=15
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.895777 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73" gracePeriod=15
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.895851 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f" gracePeriod=15
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.895870 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca" gracePeriod=15
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.895907 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3" gracePeriod=15
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.941073 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960112 4788 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 19 08:48:56 crc kubenswrapper[4788]: E0219 08:48:56.960357 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960369 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 19 08:48:56 crc kubenswrapper[4788]: E0219 08:48:56.960384 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960390 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 19 08:48:56 crc kubenswrapper[4788]: E0219 08:48:56.960397 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960403 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 19 08:48:56 crc kubenswrapper[4788]: E0219 08:48:56.960412 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960419 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 19 08:48:56 crc kubenswrapper[4788]: E0219 08:48:56.960427 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960435 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 08:48:56 crc kubenswrapper[4788]: E0219 08:48:56.960443 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960449 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960548 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960558 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960564 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960572 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960583 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 19 08:48:56 crc kubenswrapper[4788]: E0219 08:48:56.960677 4788 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960685 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 08:48:56 crc kubenswrapper[4788]: I0219 08:48:56.960769 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.049526 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.049629 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.049671 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.049725 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.049812 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.049915 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.049973 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.050109 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151215 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151301 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151349 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151388 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151422 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151430 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151483 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151458 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151504 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151502 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151612 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151644 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151702 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151745 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151818 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.151840 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.231592 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:48:57 crc kubenswrapper[4788]: W0219 08:48:57.260883 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-ce3df7facf77367063c3bac5f3d05733d4a53963bae95a0f3e1c0fc3a5cbfb95 WatchSource:0}: Error finding container ce3df7facf77367063c3bac5f3d05733d4a53963bae95a0f3e1c0fc3a5cbfb95: Status 404 returned error can't find the container with id ce3df7facf77367063c3bac5f3d05733d4a53963bae95a0f3e1c0fc3a5cbfb95 Feb 19 08:48:57 crc kubenswrapper[4788]: E0219 08:48:57.266783 4788 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.169:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895999f268e46b0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 08:48:57.265866416 +0000 UTC m=+239.253877888,LastTimestamp:2026-02-19 08:48:57.265866416 +0000 UTC m=+239.253877888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.640302 4788 generic.go:334] "Generic (PLEG): container finished" podID="f6604880-4b64-4653-bfa0-f2e6448d801f" containerID="daeff5521ddd17371de0c638c6aad5cc66a27cb3e43d5ec6ffd6949940bd4039" exitCode=0 Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.640429 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f6604880-4b64-4653-bfa0-f2e6448d801f","Type":"ContainerDied","Data":"daeff5521ddd17371de0c638c6aad5cc66a27cb3e43d5ec6ffd6949940bd4039"} Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.641395 4788 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.642075 4788 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.642982 4788 status_manager.go:851] "Failed to get status for pod" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.643983 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad"} Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.644024 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ce3df7facf77367063c3bac5f3d05733d4a53963bae95a0f3e1c0fc3a5cbfb95"} Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.644788 4788 status_manager.go:851] "Failed to get status for pod" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.645362 4788 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.645878 4788 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.647600 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 19 08:48:57 crc 
kubenswrapper[4788]: I0219 08:48:57.649444 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.650716 4788 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73" exitCode=0 Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.650768 4788 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3" exitCode=0 Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.650792 4788 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f" exitCode=0 Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.650817 4788 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca" exitCode=2 Feb 19 08:48:57 crc kubenswrapper[4788]: I0219 08:48:57.650794 4788 scope.go:117] "RemoveContainer" containerID="d3201660edccf83695ca6dcc8d81a07e6b55502d3be3413d9d1bee4c6b4c5d23" Feb 19 08:48:57 crc kubenswrapper[4788]: E0219 08:48:57.800706 4788 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.169:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" volumeName="registry-storage" Feb 19 08:48:58 crc kubenswrapper[4788]: E0219 
08:48:58.527487 4788 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:58 crc kubenswrapper[4788]: E0219 08:48:58.528039 4788 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:58 crc kubenswrapper[4788]: E0219 08:48:58.528565 4788 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:58 crc kubenswrapper[4788]: E0219 08:48:58.528995 4788 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:58 crc kubenswrapper[4788]: E0219 08:48:58.529533 4788 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:58 crc kubenswrapper[4788]: I0219 08:48:58.529586 4788 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 19 08:48:58 crc kubenswrapper[4788]: E0219 08:48:58.530033 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="200ms" Feb 19 08:48:58 crc kubenswrapper[4788]: 
I0219 08:48:58.662527 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 08:48:58 crc kubenswrapper[4788]: I0219 08:48:58.721565 4788 status_manager.go:851] "Failed to get status for pod" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:58 crc kubenswrapper[4788]: I0219 08:48:58.722051 4788 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:58 crc kubenswrapper[4788]: I0219 08:48:58.722566 4788 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:58 crc kubenswrapper[4788]: E0219 08:48:58.731520 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="400ms" Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.035486 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.036673 4788 status_manager.go:851] "Failed to get status for pod" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.036920 4788 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" Feb 19 08:48:59 crc kubenswrapper[4788]: E0219 08:48:59.132136 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="800ms" Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.226476 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6604880-4b64-4653-bfa0-f2e6448d801f-kube-api-access\") pod \"f6604880-4b64-4653-bfa0-f2e6448d801f\" (UID: \"f6604880-4b64-4653-bfa0-f2e6448d801f\") " Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.226535 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-kubelet-dir\") pod \"f6604880-4b64-4653-bfa0-f2e6448d801f\" (UID: \"f6604880-4b64-4653-bfa0-f2e6448d801f\") " Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.226625 4788 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-var-lock\") pod \"f6604880-4b64-4653-bfa0-f2e6448d801f\" (UID: \"f6604880-4b64-4653-bfa0-f2e6448d801f\") " Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.226906 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-var-lock" (OuterVolumeSpecName: "var-lock") pod "f6604880-4b64-4653-bfa0-f2e6448d801f" (UID: "f6604880-4b64-4653-bfa0-f2e6448d801f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.227445 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f6604880-4b64-4653-bfa0-f2e6448d801f" (UID: "f6604880-4b64-4653-bfa0-f2e6448d801f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.234573 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6604880-4b64-4653-bfa0-f2e6448d801f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f6604880-4b64-4653-bfa0-f2e6448d801f" (UID: "f6604880-4b64-4653-bfa0-f2e6448d801f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.286898 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.288060 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.288857 4788 status_manager.go:851] "Failed to get status for pod" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.289458 4788 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.290037 4788 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.327850 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6604880-4b64-4653-bfa0-f2e6448d801f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.328174 4788 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.328288 4788 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f6604880-4b64-4653-bfa0-f2e6448d801f-var-lock\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.429412 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.429516 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.429536 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.429592 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.429642 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.429741 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.430053 4788 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.430086 4788 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.430110 4788 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.673711 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f6604880-4b64-4653-bfa0-f2e6448d801f","Type":"ContainerDied","Data":"7746ec2090aa46f02de0e079d213b481d9402bd64e4db3bbc59733d48bd07310"}
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.673764 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7746ec2090aa46f02de0e079d213b481d9402bd64e4db3bbc59733d48bd07310"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.673835 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.679476 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.680762 4788 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171" exitCode=0
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.680846 4788 scope.go:117] "RemoveContainer" containerID="80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.680884 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.698922 4788 status_manager.go:851] "Failed to get status for pod" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.699759 4788 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.700315 4788 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.704759 4788 scope.go:117] "RemoveContainer" containerID="570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.715608 4788 status_manager.go:851] "Failed to get status for pod" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.716172 4788 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.716795 4788 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.725732 4788 scope.go:117] "RemoveContainer" containerID="67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.748594 4788 scope.go:117] "RemoveContainer" containerID="3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.768742 4788 scope.go:117] "RemoveContainer" containerID="d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.790787 4788 scope.go:117] "RemoveContainer" containerID="bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.813642 4788 scope.go:117] "RemoveContainer" containerID="80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73"
Feb 19 08:48:59 crc kubenswrapper[4788]: E0219 08:48:59.814862 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\": container with ID starting with 80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73 not found: ID does not exist" containerID="80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.814962 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73"} err="failed to get container status \"80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\": rpc error: code = NotFound desc = could not find container \"80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73\": container with ID starting with 80bff58ca5cf7479e190ef948e0240f5f4d5e68303f086a03979acb1e2b82b73 not found: ID does not exist"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.815031 4788 scope.go:117] "RemoveContainer" containerID="570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3"
Feb 19 08:48:59 crc kubenswrapper[4788]: E0219 08:48:59.815681 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\": container with ID starting with 570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3 not found: ID does not exist" containerID="570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.815739 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3"} err="failed to get container status \"570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\": rpc error: code = NotFound desc = could not find container \"570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3\": container with ID starting with 570542fe90eb50b90c64ab52b6943eec810bb1f0fcf9ebb3bd1b0909ac7e3dd3 not found: ID does not exist"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.815780 4788 scope.go:117] "RemoveContainer" containerID="67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f"
Feb 19 08:48:59 crc kubenswrapper[4788]: E0219 08:48:59.816315 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\": container with ID starting with 67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f not found: ID does not exist" containerID="67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.816416 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f"} err="failed to get container status \"67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\": rpc error: code = NotFound desc = could not find container \"67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f\": container with ID starting with 67eb905a490a483e1bfc84ce46e7e8c43750ad2e7b067622af911dd57530175f not found: ID does not exist"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.816483 4788 scope.go:117] "RemoveContainer" containerID="3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca"
Feb 19 08:48:59 crc kubenswrapper[4788]: E0219 08:48:59.817702 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\": container with ID starting with 3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca not found: ID does not exist" containerID="3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.817744 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca"} err="failed to get container status \"3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\": rpc error: code = NotFound desc = could not find container \"3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca\": container with ID starting with 3ae95b970ce72ff71f6b0316369fe60c05bc63eb2f5bde29ef59b315745ec8ca not found: ID does not exist"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.817770 4788 scope.go:117] "RemoveContainer" containerID="d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171"
Feb 19 08:48:59 crc kubenswrapper[4788]: E0219 08:48:59.818146 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\": container with ID starting with d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171 not found: ID does not exist" containerID="d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.818196 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171"} err="failed to get container status \"d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\": rpc error: code = NotFound desc = could not find container \"d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171\": container with ID starting with d404f3c26ab6bcb4e11e0634f83461d4ea7820361794a946721d5bb4e4199171 not found: ID does not exist"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.818235 4788 scope.go:117] "RemoveContainer" containerID="bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69"
Feb 19 08:48:59 crc kubenswrapper[4788]: E0219 08:48:59.818591 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\": container with ID starting with bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69 not found: ID does not exist" containerID="bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69"
Feb 19 08:48:59 crc kubenswrapper[4788]: I0219 08:48:59.818619 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69"} err="failed to get container status \"bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\": rpc error: code = NotFound desc = could not find container \"bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69\": container with ID starting with bf2a51bff4b856552ebc8cf2d8f694ab91b878ab0b8542d4b2acd1c8c2edac69 not found: ID does not exist"
Feb 19 08:48:59 crc kubenswrapper[4788]: E0219 08:48:59.933508 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="1.6s"
Feb 19 08:49:00 crc kubenswrapper[4788]: I0219 08:49:00.724138 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 19 08:49:01 crc kubenswrapper[4788]: E0219 08:49:01.534213 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="3.2s"
Feb 19 08:49:04 crc kubenswrapper[4788]: E0219 08:49:04.735182 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="6.4s"
Feb 19 08:49:05 crc kubenswrapper[4788]: E0219 08:49:05.522534 4788 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.169:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895999f268e46b0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-19 08:48:57.265866416 +0000 UTC m=+239.253877888,LastTimestamp:2026-02-19 08:48:57.265866416 +0000 UTC m=+239.253877888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 19 08:49:08 crc kubenswrapper[4788]: I0219 08:49:08.720611 4788 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:49:08 crc kubenswrapper[4788]: I0219 08:49:08.721424 4788 status_manager.go:851] "Failed to get status for pod" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:49:10 crc kubenswrapper[4788]: I0219 08:49:10.714439 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:49:10 crc kubenswrapper[4788]: I0219 08:49:10.715323 4788 status_manager.go:851] "Failed to get status for pod" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:49:10 crc kubenswrapper[4788]: I0219 08:49:10.716018 4788 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:49:10 crc kubenswrapper[4788]: I0219 08:49:10.739590 4788 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5eb9b73e-bd92-4b79-97c3-d9b9955f8375"
Feb 19 08:49:10 crc kubenswrapper[4788]: I0219 08:49:10.739651 4788 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5eb9b73e-bd92-4b79-97c3-d9b9955f8375"
Feb 19 08:49:10 crc kubenswrapper[4788]: E0219 08:49:10.741229 4788 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:49:10 crc kubenswrapper[4788]: I0219 08:49:10.742126 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:49:10 crc kubenswrapper[4788]: W0219 08:49:10.778159 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-495d414c86a56d6222981426170cf3e34c38667ade970594e76272a7b83b1284 WatchSource:0}: Error finding container 495d414c86a56d6222981426170cf3e34c38667ade970594e76272a7b83b1284: Status 404 returned error can't find the container with id 495d414c86a56d6222981426170cf3e34c38667ade970594e76272a7b83b1284
Feb 19 08:49:10 crc kubenswrapper[4788]: I0219 08:49:10.790879 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"495d414c86a56d6222981426170cf3e34c38667ade970594e76272a7b83b1284"}
Feb 19 08:49:11 crc kubenswrapper[4788]: E0219 08:49:11.137363 4788 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.169:6443: connect: connection refused" interval="7s"
Feb 19 08:49:11 crc kubenswrapper[4788]: I0219 08:49:11.801893 4788 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="473a9ff5999d860b227c4a39d8de9eae57c3927772746f2b7fc8f8b223e7a690" exitCode=0
Feb 19 08:49:11 crc kubenswrapper[4788]: I0219 08:49:11.802016 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"473a9ff5999d860b227c4a39d8de9eae57c3927772746f2b7fc8f8b223e7a690"}
Feb 19 08:49:11 crc kubenswrapper[4788]: I0219 08:49:11.802384 4788 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5eb9b73e-bd92-4b79-97c3-d9b9955f8375"
Feb 19 08:49:11 crc kubenswrapper[4788]: I0219 08:49:11.802423 4788 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5eb9b73e-bd92-4b79-97c3-d9b9955f8375"
Feb 19 08:49:11 crc kubenswrapper[4788]: I0219 08:49:11.802973 4788 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:49:11 crc kubenswrapper[4788]: E0219 08:49:11.803032 4788 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.169:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:49:11 crc kubenswrapper[4788]: I0219 08:49:11.803479 4788 status_manager.go:851] "Failed to get status for pod" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.169:6443: connect: connection refused"
Feb 19 08:49:12 crc kubenswrapper[4788]: I0219 08:49:12.810945 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 19 08:49:12 crc kubenswrapper[4788]: I0219 08:49:12.812010 4788 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57" exitCode=1
Feb 19 08:49:12 crc kubenswrapper[4788]: I0219 08:49:12.812133 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57"}
Feb 19 08:49:12 crc kubenswrapper[4788]: I0219 08:49:12.812702 4788 scope.go:117] "RemoveContainer" containerID="3b265ecbc546474b0a0ddb1ea623593a4111faa8f073e60ce931e4c59f344c57"
Feb 19 08:49:12 crc kubenswrapper[4788]: I0219 08:49:12.817588 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0730f3808fc0393b8d73e67a7faeed509bcb2d1aca03739e7534683824a63a0c"}
Feb 19 08:49:12 crc kubenswrapper[4788]: I0219 08:49:12.817627 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d78ca20426db23877a6fa8898b8643260968c0e81a6b4f7edbb56de50cefafbd"}
Feb 19 08:49:12 crc kubenswrapper[4788]: I0219 08:49:12.817640 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1848b1eab628b26b729ce5020b24cd8627c38d31168f80b1f6c0093f34fc84ce"}
Feb 19 08:49:13 crc kubenswrapper[4788]: I0219 08:49:13.833540 4788 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5eb9b73e-bd92-4b79-97c3-d9b9955f8375"
Feb 19 08:49:13 crc kubenswrapper[4788]: I0219 08:49:13.833543 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"115d4d2cde3cdd0a6147063d9bf22f5953d08c75dfb5e52fef614f1b52096d11"}
Feb 19 08:49:13 crc kubenswrapper[4788]: I0219 08:49:13.833662 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:49:13 crc kubenswrapper[4788]: I0219 08:49:13.833681 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8a6283c0b93d167287d08e22b83d8c12cf2c3c112dbf5ef7964fed4a1ee51b32"}
Feb 19 08:49:13 crc kubenswrapper[4788]: I0219 08:49:13.833583 4788 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5eb9b73e-bd92-4b79-97c3-d9b9955f8375"
Feb 19 08:49:13 crc kubenswrapper[4788]: I0219 08:49:13.836509 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 19 08:49:13 crc kubenswrapper[4788]: I0219 08:49:13.836573 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"237dcef8e8dd3b6edd0dcfcd3ee2c3f0b3f468d3734f7dff12b4b8f2fc9c5048"}
Feb 19 08:49:15 crc kubenswrapper[4788]: I0219 08:49:15.709501 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 08:49:15 crc kubenswrapper[4788]: I0219 08:49:15.742461 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:49:15 crc kubenswrapper[4788]: I0219 08:49:15.742525 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:49:15 crc kubenswrapper[4788]: I0219 08:49:15.748942 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:49:18 crc kubenswrapper[4788]: I0219 08:49:18.277995 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 08:49:18 crc kubenswrapper[4788]: I0219 08:49:18.283598 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 08:49:18 crc kubenswrapper[4788]: I0219 08:49:18.845998 4788 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:49:19 crc kubenswrapper[4788]: I0219 08:49:19.873388 4788 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5eb9b73e-bd92-4b79-97c3-d9b9955f8375"
Feb 19 08:49:19 crc kubenswrapper[4788]: I0219 08:49:19.873424 4788 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5eb9b73e-bd92-4b79-97c3-d9b9955f8375"
Feb 19 08:49:19 crc kubenswrapper[4788]: I0219 08:49:19.878496 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 19 08:49:19 crc kubenswrapper[4788]: I0219 08:49:19.880732 4788 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2bbc8e17-7a45-4618-97b3-69620b7e00be"
Feb 19 08:49:20 crc kubenswrapper[4788]: I0219 08:49:20.880043 4788 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5eb9b73e-bd92-4b79-97c3-d9b9955f8375"
Feb 19 08:49:20 crc kubenswrapper[4788]: I0219 08:49:20.881390 4788 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="5eb9b73e-bd92-4b79-97c3-d9b9955f8375"
Feb 19 08:49:25 crc kubenswrapper[4788]: I0219 08:49:25.716608 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 19 08:49:28 crc kubenswrapper[4788]: I0219 08:49:28.280955 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 19 08:49:28 crc kubenswrapper[4788]: I0219 08:49:28.326654 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 19 08:49:28 crc kubenswrapper[4788]: I0219 08:49:28.425171 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 19 08:49:28 crc kubenswrapper[4788]: I0219 08:49:28.700578 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 19 08:49:28 crc kubenswrapper[4788]: I0219 08:49:28.710144 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 19 08:49:28 crc kubenswrapper[4788]: I0219 08:49:28.712104 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 19 08:49:28 crc kubenswrapper[4788]: I0219 08:49:28.733388 4788 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2bbc8e17-7a45-4618-97b3-69620b7e00be"
Feb 19 08:49:29 crc kubenswrapper[4788]: I0219 08:49:29.129882 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 19 08:49:29 crc kubenswrapper[4788]: I0219 08:49:29.246844 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 19 08:49:29 crc kubenswrapper[4788]: I0219 08:49:29.306365 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 19 08:49:29 crc kubenswrapper[4788]: I0219 08:49:29.459167 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 19 08:49:29 crc kubenswrapper[4788]: I0219 08:49:29.869611 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 19 08:49:30 crc kubenswrapper[4788]: I0219 08:49:30.145285 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 19 08:49:30 crc kubenswrapper[4788]: I0219 08:49:30.231299 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 19 08:49:30 crc kubenswrapper[4788]: I0219 08:49:30.299688 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 19 08:49:30 crc kubenswrapper[4788]: I0219 08:49:30.528804 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 19 08:49:30 crc kubenswrapper[4788]: I0219 08:49:30.640820 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 19 08:49:30 crc kubenswrapper[4788]: I0219 08:49:30.864139 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 19 08:49:30 crc kubenswrapper[4788]: I0219 08:49:30.975522 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.056835 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.281954 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.487901 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.518476 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.583169 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.656200 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.680616 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.779053 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.789672 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.833089 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.889982 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 19 08:49:31 crc kubenswrapper[4788]: I0219 08:49:31.994538 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 19 08:49:32 crc kubenswrapper[4788]: I0219 08:49:32.358472 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 19 08:49:32 crc kubenswrapper[4788]: I0219 08:49:32.394550 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 19 08:49:32 crc kubenswrapper[4788]: I0219 08:49:32.431226 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 19 08:49:32 crc kubenswrapper[4788]: I0219 08:49:32.628587 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 19 08:49:32 crc kubenswrapper[4788]: I0219 08:49:32.697489 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 19 08:49:32 crc kubenswrapper[4788]: I0219 08:49:32.723568 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 19 08:49:32 crc kubenswrapper[4788]: I0219 08:49:32.753871 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 19 08:49:32 crc kubenswrapper[4788]: I0219 08:49:32.768740 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 19 08:49:32 crc kubenswrapper[4788]: I0219 08:49:32.859221 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 19 08:49:32 crc kubenswrapper[4788]: I0219 08:49:32.892024 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 19 08:49:32 crc kubenswrapper[4788]: I0219 08:49:32.893647 4788 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-etcd-operator"/"etcd-client" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.070139 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.121214 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.140734 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.278732 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.283851 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.327816 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.351904 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.370014 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.445013 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.453031 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 19 08:49:33 crc 
kubenswrapper[4788]: I0219 08:49:33.487624 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.505698 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.534215 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.588442 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.672741 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.708959 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.720519 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.753796 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.761526 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.920815 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.943961 4788 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 19 08:49:33 crc kubenswrapper[4788]: I0219 08:49:33.978748 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.068847 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.109782 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.113529 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.200378 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.216401 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.267768 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.288014 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.358572 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.522393 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.531917 4788 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.590357 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.686616 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.773887 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.801020 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.861767 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.871364 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.917935 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.948972 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 19 08:49:34 crc kubenswrapper[4788]: I0219 08:49:34.974877 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.024929 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 19 08:49:35 crc 
kubenswrapper[4788]: I0219 08:49:35.033440 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.088475 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.123682 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.183208 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.219636 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.220157 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.251792 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.253279 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.266573 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.440326 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.467347 4788 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"kube-root-ca.crt" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.477797 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.570034 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.590480 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.638129 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.686508 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.705758 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.707449 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.714047 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.739041 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.778968 4788 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.879126 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.916845 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.924059 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.955551 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 19 08:49:35 crc kubenswrapper[4788]: I0219 08:49:35.966969 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.139520 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.168174 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.178750 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.326582 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.344038 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.355498 4788 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.373970 4788 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.394616 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.595634 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.673400 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.679969 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.735403 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.773849 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.882223 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.896194 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 19 08:49:36 crc kubenswrapper[4788]: I0219 08:49:36.937831 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.035349 4788 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.095300 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.155930 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.163182 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.294655 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.335197 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.469730 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.547414 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.593365 4788 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.642500 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.713358 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 19 08:49:37 crc kubenswrapper[4788]: 
I0219 08:49:37.713364 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 19 08:49:37 crc kubenswrapper[4788]: I0219 08:49:37.952583 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.153203 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.259539 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.325318 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.475495 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.591656 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.608596 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.614725 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.649116 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.683516 4788 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.701098 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.720441 4788 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.793564 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.817999 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.896030 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 19 08:49:38 crc kubenswrapper[4788]: I0219 08:49:38.921842 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.004481 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.061124 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.250613 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.261591 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 19 08:49:39 crc 
kubenswrapper[4788]: I0219 08:49:39.330602 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.330654 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.372007 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.479580 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.499984 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.583793 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.781735 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.783549 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.804154 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.865614 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.884707 4788 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.944414 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 19 08:49:39 crc kubenswrapper[4788]: I0219 08:49:39.960777 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.113102 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.121757 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.209647 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.268162 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.285129 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.286681 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.381526 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.401330 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 19 08:49:40 crc kubenswrapper[4788]: 
I0219 08:49:40.453735 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.478578 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.530781 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.633634 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.644708 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.677531 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.698150 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.708047 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.824844 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 19 08:49:40 crc kubenswrapper[4788]: I0219 08:49:40.948333 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.010144 4788 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.016171 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.022469 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.116672 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.156373 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.168413 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.171915 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.180999 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.261007 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.285563 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.303013 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 19 08:49:41 crc 
kubenswrapper[4788]: I0219 08:49:41.314078 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.336918 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.418016 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.481850 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.560378 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.578046 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.588192 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.594434 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.633090 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.693904 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.858437 4788 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.861936 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.879554 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 19 08:49:41 crc kubenswrapper[4788]: I0219 08:49:41.922780 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 19 08:49:42 crc kubenswrapper[4788]: I0219 08:49:42.016900 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 19 08:49:42 crc kubenswrapper[4788]: I0219 08:49:42.042638 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 19 08:49:42 crc kubenswrapper[4788]: I0219 08:49:42.068877 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 19 08:49:42 crc kubenswrapper[4788]: I0219 08:49:42.110107 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 19 08:49:42 crc kubenswrapper[4788]: I0219 08:49:42.371921 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 19 08:49:42 crc kubenswrapper[4788]: I0219 08:49:42.491454 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 19 08:49:42 crc kubenswrapper[4788]: I0219 08:49:42.513116 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 19 08:49:42 crc 
kubenswrapper[4788]: I0219 08:49:42.604971 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 19 08:49:42 crc kubenswrapper[4788]: I0219 08:49:42.793975 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 08:49:42 crc kubenswrapper[4788]: I0219 08:49:42.827585 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 19 08:49:42 crc kubenswrapper[4788]: I0219 08:49:42.961481 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 19 08:49:42 crc kubenswrapper[4788]: I0219 08:49:42.990106 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 19 08:49:43 crc kubenswrapper[4788]: I0219 08:49:43.000275 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 19 08:49:43 crc kubenswrapper[4788]: I0219 08:49:43.145136 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 19 08:49:43 crc kubenswrapper[4788]: I0219 08:49:43.159778 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 19 08:49:43 crc kubenswrapper[4788]: I0219 08:49:43.173342 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 19 08:49:43 crc kubenswrapper[4788]: I0219 08:49:43.535081 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 19 08:49:43 crc kubenswrapper[4788]: I0219 08:49:43.610914 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 
19 08:49:43 crc kubenswrapper[4788]: I0219 08:49:43.667689 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 19 08:49:43 crc kubenswrapper[4788]: I0219 08:49:43.707644 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 19 08:49:43 crc kubenswrapper[4788]: I0219 08:49:43.769690 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 19 08:49:43 crc kubenswrapper[4788]: I0219 08:49:43.894748 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 19 08:49:43 crc kubenswrapper[4788]: I0219 08:49:43.894975 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 19 08:49:44 crc kubenswrapper[4788]: I0219 08:49:44.060220 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 19 08:49:44 crc kubenswrapper[4788]: I0219 08:49:44.153672 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 19 08:49:44 crc kubenswrapper[4788]: I0219 08:49:44.176412 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 19 08:49:44 crc kubenswrapper[4788]: I0219 08:49:44.190776 4788 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 19 08:49:44 crc kubenswrapper[4788]: I0219 08:49:44.281753 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 19 08:49:44 crc kubenswrapper[4788]: I0219 08:49:44.299386 4788 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 19 08:49:44 crc kubenswrapper[4788]: I0219 08:49:44.313984 4788 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 19 08:49:44 crc kubenswrapper[4788]: I0219 08:49:44.397744 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 19 08:49:44 crc kubenswrapper[4788]: I0219 08:49:44.461170 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 19 08:49:44 crc kubenswrapper[4788]: I0219 08:49:44.928736 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 19 08:49:45 crc kubenswrapper[4788]: I0219 08:49:45.160362 4788 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 19 08:49:45 crc kubenswrapper[4788]: I0219 08:49:45.163464 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 19 08:49:45 crc kubenswrapper[4788]: I0219 08:49:45.163985 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=49.163970958 podStartE2EDuration="49.163970958s" podCreationTimestamp="2026-02-19 08:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:49:18.657654443 +0000 UTC m=+260.645665915" watchObservedRunningTime="2026-02-19 08:49:45.163970958 +0000 UTC m=+287.151982440" Feb 19 08:49:45 crc kubenswrapper[4788]: I0219 08:49:45.164974 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 08:49:45 crc kubenswrapper[4788]: 
I0219 08:49:45.165019 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 19 08:49:45 crc kubenswrapper[4788]: I0219 08:49:45.171683 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 19 08:49:45 crc kubenswrapper[4788]: I0219 08:49:45.196927 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=27.196895626 podStartE2EDuration="27.196895626s" podCreationTimestamp="2026-02-19 08:49:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:49:45.187042658 +0000 UTC m=+287.175054220" watchObservedRunningTime="2026-02-19 08:49:45.196895626 +0000 UTC m=+287.184907138" Feb 19 08:49:45 crc kubenswrapper[4788]: I0219 08:49:45.331057 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 19 08:49:45 crc kubenswrapper[4788]: I0219 08:49:45.448452 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 19 08:49:45 crc kubenswrapper[4788]: I0219 08:49:45.644998 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 19 08:49:46 crc kubenswrapper[4788]: I0219 08:49:46.018926 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 19 08:49:52 crc kubenswrapper[4788]: I0219 08:49:52.701707 4788 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 08:49:52 crc kubenswrapper[4788]: I0219 08:49:52.702703 4788 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad" gracePeriod=5 Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.808747 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.809421 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.974773 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.974854 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.974885 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.974914 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.974933 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.975018 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.975154 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.975188 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.975358 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 08:49:57 crc kubenswrapper[4788]: I0219 08:49:57.985203 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.076004 4788 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.076043 4788 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.076056 4788 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.076066 4788 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.076075 4788 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.097056 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.097136 4788 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad" exitCode=137 Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.097217 4788 scope.go:117] "RemoveContainer" containerID="aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.097323 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.114869 4788 scope.go:117] "RemoveContainer" containerID="aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad" Feb 19 08:49:58 crc kubenswrapper[4788]: E0219 08:49:58.115403 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad\": container with ID starting with aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad not found: ID does not exist" containerID="aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.115505 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad"} err="failed to get container status \"aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad\": rpc error: code = NotFound desc = could not find container \"aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad\": container with ID starting with aaa46b6d952ddf041f4b6520c279f969439533fa88c03071dd62a187c36917ad not found: 
ID does not exist" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.504086 4788 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.731036 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.731528 4788 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.742766 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.743047 4788 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="b0d42431-9367-401c-9d77-48eaca45cacc" Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.747186 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 19 08:49:58 crc kubenswrapper[4788]: I0219 08:49:58.747262 4788 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="b0d42431-9367-401c-9d77-48eaca45cacc" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.374803 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swpvd"] Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.375077 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-swpvd" podUID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" containerName="registry-server" 
containerID="cri-o://a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52" gracePeriod=30 Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.386291 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmfkx"] Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.386591 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wmfkx" podUID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" containerName="registry-server" containerID="cri-o://eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa" gracePeriod=30 Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.399897 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jdjtx"] Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.400214 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" podUID="90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6" containerName="marketplace-operator" containerID="cri-o://9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737" gracePeriod=30 Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.408343 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2fqp"] Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.408620 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s2fqp" podUID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerName="registry-server" containerID="cri-o://65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c" gracePeriod=30 Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.413913 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fx2sh"] Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.414462 4788 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fx2sh" podUID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerName="registry-server" containerID="cri-o://6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a" gracePeriod=30 Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.432673 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7fl6r"] Feb 19 08:49:59 crc kubenswrapper[4788]: E0219 08:49:59.433098 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" containerName="installer" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.433125 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" containerName="installer" Feb 19 08:49:59 crc kubenswrapper[4788]: E0219 08:49:59.433154 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.433169 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.433388 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.433423 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6604880-4b64-4653-bfa0-f2e6448d801f" containerName="installer" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.434073 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.441885 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7fl6r"] Feb 19 08:49:59 crc kubenswrapper[4788]: E0219 08:49:59.539624 4788 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c is running failed: container process not found" containerID="65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c" cmd=["grpc_health_probe","-addr=:50051"] Feb 19 08:49:59 crc kubenswrapper[4788]: E0219 08:49:59.539904 4788 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c is running failed: container process not found" containerID="65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c" cmd=["grpc_health_probe","-addr=:50051"] Feb 19 08:49:59 crc kubenswrapper[4788]: E0219 08:49:59.540265 4788 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c is running failed: container process not found" containerID="65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c" cmd=["grpc_health_probe","-addr=:50051"] Feb 19 08:49:59 crc kubenswrapper[4788]: E0219 08:49:59.540349 4788 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c is running failed: container process not found" probeType="Readiness" 
pod="openshift-marketplace/redhat-marketplace-s2fqp" podUID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerName="registry-server" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.601629 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7smm\" (UniqueName: \"kubernetes.io/projected/d2a09672-cb2b-4c7a-93f2-78a0e16d752f-kube-api-access-m7smm\") pod \"marketplace-operator-79b997595-7fl6r\" (UID: \"d2a09672-cb2b-4c7a-93f2-78a0e16d752f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.601853 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d2a09672-cb2b-4c7a-93f2-78a0e16d752f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7fl6r\" (UID: \"d2a09672-cb2b-4c7a-93f2-78a0e16d752f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.602261 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2a09672-cb2b-4c7a-93f2-78a0e16d752f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7fl6r\" (UID: \"d2a09672-cb2b-4c7a-93f2-78a0e16d752f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.703309 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7smm\" (UniqueName: \"kubernetes.io/projected/d2a09672-cb2b-4c7a-93f2-78a0e16d752f-kube-api-access-m7smm\") pod \"marketplace-operator-79b997595-7fl6r\" (UID: \"d2a09672-cb2b-4c7a-93f2-78a0e16d752f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r" Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.703357 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d2a09672-cb2b-4c7a-93f2-78a0e16d752f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7fl6r\" (UID: \"d2a09672-cb2b-4c7a-93f2-78a0e16d752f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r"
Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.703374 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2a09672-cb2b-4c7a-93f2-78a0e16d752f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7fl6r\" (UID: \"d2a09672-cb2b-4c7a-93f2-78a0e16d752f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r"
Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.704548 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2a09672-cb2b-4c7a-93f2-78a0e16d752f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7fl6r\" (UID: \"d2a09672-cb2b-4c7a-93f2-78a0e16d752f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r"
Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.710263 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d2a09672-cb2b-4c7a-93f2-78a0e16d752f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7fl6r\" (UID: \"d2a09672-cb2b-4c7a-93f2-78a0e16d752f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r"
Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.724889 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7smm\" (UniqueName: \"kubernetes.io/projected/d2a09672-cb2b-4c7a-93f2-78a0e16d752f-kube-api-access-m7smm\") pod \"marketplace-operator-79b997595-7fl6r\" (UID: \"d2a09672-cb2b-4c7a-93f2-78a0e16d752f\") " pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r"
Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.802116 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r"
Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.808021 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmfkx"
Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.817432 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fx2sh"
Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.820288 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx"
Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.820755 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swpvd"
Feb 19 08:49:59 crc kubenswrapper[4788]: I0219 08:49:59.826516 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2fqp"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.005509 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-utilities\") pod \"eaaee2a4-db49-437f-a87b-98beb5e66e91\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.005844 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-catalog-content\") pod \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.005878 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-catalog-content\") pod \"eaaee2a4-db49-437f-a87b-98beb5e66e91\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.005903 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-operator-metrics\") pod \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.005928 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cj29\" (UniqueName: \"kubernetes.io/projected/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-kube-api-access-2cj29\") pod \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\" (UID: \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.005976 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-utilities\") pod \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\" (UID: \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.005990 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-utilities\") pod \"85dfd540-d029-4a79-a997-3f2f3796b7b1\" (UID: \"85dfd540-d029-4a79-a997-3f2f3796b7b1\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.006009 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn6bl\" (UniqueName: \"kubernetes.io/projected/85dfd540-d029-4a79-a997-3f2f3796b7b1-kube-api-access-jn6bl\") pod \"85dfd540-d029-4a79-a997-3f2f3796b7b1\" (UID: \"85dfd540-d029-4a79-a997-3f2f3796b7b1\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.006034 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bxqf\" (UniqueName: \"kubernetes.io/projected/eaaee2a4-db49-437f-a87b-98beb5e66e91-kube-api-access-4bxqf\") pod \"eaaee2a4-db49-437f-a87b-98beb5e66e91\" (UID: \"eaaee2a4-db49-437f-a87b-98beb5e66e91\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.006053 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-catalog-content\") pod \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\" (UID: \"c3846ca6-3c9c-4f02-978e-bee6148e0ba7\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.006106 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-trusted-ca\") pod \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.006124 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6tvr\" (UniqueName: \"kubernetes.io/projected/f727c8c6-b0d5-470e-bd9a-593b908dbef4-kube-api-access-l6tvr\") pod \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.006146 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-utilities\") pod \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\" (UID: \"f727c8c6-b0d5-470e-bd9a-593b908dbef4\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.006165 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljg8c\" (UniqueName: \"kubernetes.io/projected/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-kube-api-access-ljg8c\") pod \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\" (UID: \"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.006192 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-catalog-content\") pod \"85dfd540-d029-4a79-a997-3f2f3796b7b1\" (UID: \"85dfd540-d029-4a79-a997-3f2f3796b7b1\") "
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.007147 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-utilities" (OuterVolumeSpecName: "utilities") pod "eaaee2a4-db49-437f-a87b-98beb5e66e91" (UID: "eaaee2a4-db49-437f-a87b-98beb5e66e91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.007380 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-utilities" (OuterVolumeSpecName: "utilities") pod "f727c8c6-b0d5-470e-bd9a-593b908dbef4" (UID: "f727c8c6-b0d5-470e-bd9a-593b908dbef4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.007572 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-utilities" (OuterVolumeSpecName: "utilities") pod "85dfd540-d029-4a79-a997-3f2f3796b7b1" (UID: "85dfd540-d029-4a79-a997-3f2f3796b7b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.008741 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-utilities" (OuterVolumeSpecName: "utilities") pod "c3846ca6-3c9c-4f02-978e-bee6148e0ba7" (UID: "c3846ca6-3c9c-4f02-978e-bee6148e0ba7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.010798 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6" (UID: "90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.011198 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85dfd540-d029-4a79-a997-3f2f3796b7b1-kube-api-access-jn6bl" (OuterVolumeSpecName: "kube-api-access-jn6bl") pod "85dfd540-d029-4a79-a997-3f2f3796b7b1" (UID: "85dfd540-d029-4a79-a997-3f2f3796b7b1"). InnerVolumeSpecName "kube-api-access-jn6bl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.013648 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaaee2a4-db49-437f-a87b-98beb5e66e91-kube-api-access-4bxqf" (OuterVolumeSpecName: "kube-api-access-4bxqf") pod "eaaee2a4-db49-437f-a87b-98beb5e66e91" (UID: "eaaee2a4-db49-437f-a87b-98beb5e66e91"). InnerVolumeSpecName "kube-api-access-4bxqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.013698 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-kube-api-access-2cj29" (OuterVolumeSpecName: "kube-api-access-2cj29") pod "c3846ca6-3c9c-4f02-978e-bee6148e0ba7" (UID: "c3846ca6-3c9c-4f02-978e-bee6148e0ba7"). InnerVolumeSpecName "kube-api-access-2cj29". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.021291 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f727c8c6-b0d5-470e-bd9a-593b908dbef4-kube-api-access-l6tvr" (OuterVolumeSpecName: "kube-api-access-l6tvr") pod "f727c8c6-b0d5-470e-bd9a-593b908dbef4" (UID: "f727c8c6-b0d5-470e-bd9a-593b908dbef4"). InnerVolumeSpecName "kube-api-access-l6tvr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.021364 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6" (UID: "90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.024333 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-kube-api-access-ljg8c" (OuterVolumeSpecName: "kube-api-access-ljg8c") pod "90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6" (UID: "90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6"). InnerVolumeSpecName "kube-api-access-ljg8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.043554 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "85dfd540-d029-4a79-a997-3f2f3796b7b1" (UID: "85dfd540-d029-4a79-a997-3f2f3796b7b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.063522 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7fl6r"]
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.086020 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3846ca6-3c9c-4f02-978e-bee6148e0ba7" (UID: "c3846ca6-3c9c-4f02-978e-bee6148e0ba7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.086518 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f727c8c6-b0d5-470e-bd9a-593b908dbef4" (UID: "f727c8c6-b0d5-470e-bd9a-593b908dbef4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.107994 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108036 4788 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108053 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cj29\" (UniqueName: \"kubernetes.io/projected/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-kube-api-access-2cj29\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108066 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108076 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108086 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn6bl\" (UniqueName: \"kubernetes.io/projected/85dfd540-d029-4a79-a997-3f2f3796b7b1-kube-api-access-jn6bl\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108099 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bxqf\" (UniqueName: \"kubernetes.io/projected/eaaee2a4-db49-437f-a87b-98beb5e66e91-kube-api-access-4bxqf\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108110 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3846ca6-3c9c-4f02-978e-bee6148e0ba7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108122 4788 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108133 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6tvr\" (UniqueName: \"kubernetes.io/projected/f727c8c6-b0d5-470e-bd9a-593b908dbef4-kube-api-access-l6tvr\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108144 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f727c8c6-b0d5-470e-bd9a-593b908dbef4-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108153 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljg8c\" (UniqueName: \"kubernetes.io/projected/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6-kube-api-access-ljg8c\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108162 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85dfd540-d029-4a79-a997-3f2f3796b7b1-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108174 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108607 4788 generic.go:334] "Generic (PLEG): container finished" podID="90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6" containerID="9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737" exitCode=0
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108678 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108683 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" event={"ID":"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6","Type":"ContainerDied","Data":"9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737"}
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108751 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jdjtx" event={"ID":"90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6","Type":"ContainerDied","Data":"5bbd68de5318f9b02c4fcc865f459413989088ff7da20d8d8801de3659d85113"}
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.108779 4788 scope.go:117] "RemoveContainer" containerID="9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.111987 4788 generic.go:334] "Generic (PLEG): container finished" podID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerID="65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c" exitCode=0
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.112041 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s2fqp"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.112063 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2fqp" event={"ID":"85dfd540-d029-4a79-a997-3f2f3796b7b1","Type":"ContainerDied","Data":"65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c"}
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.112095 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s2fqp" event={"ID":"85dfd540-d029-4a79-a997-3f2f3796b7b1","Type":"ContainerDied","Data":"504663f4a4a18e535cf7331a360a29e3a7e3ea772de315804a8f78b8d8288857"}
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.113969 4788 generic.go:334] "Generic (PLEG): container finished" podID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerID="6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a" exitCode=0
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.114028 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx2sh" event={"ID":"eaaee2a4-db49-437f-a87b-98beb5e66e91","Type":"ContainerDied","Data":"6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a"}
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.114056 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx2sh" event={"ID":"eaaee2a4-db49-437f-a87b-98beb5e66e91","Type":"ContainerDied","Data":"3228cb949f6837237b48ba9f037c959354c8150f23fc39a74a223bc0ce09c50e"}
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.114117 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fx2sh"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.119409 4788 generic.go:334] "Generic (PLEG): container finished" podID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" containerID="a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52" exitCode=0
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.119485 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swpvd" event={"ID":"f727c8c6-b0d5-470e-bd9a-593b908dbef4","Type":"ContainerDied","Data":"a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52"}
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.119505 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swpvd" event={"ID":"f727c8c6-b0d5-470e-bd9a-593b908dbef4","Type":"ContainerDied","Data":"ea43d555964b365165ec2df291041b39b6f22a8cf71c39b13d941dbdafbbe563"}
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.119600 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swpvd"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.122263 4788 generic.go:334] "Generic (PLEG): container finished" podID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" containerID="eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa" exitCode=0
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.122322 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmfkx" event={"ID":"c3846ca6-3c9c-4f02-978e-bee6148e0ba7","Type":"ContainerDied","Data":"eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa"}
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.122350 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wmfkx" event={"ID":"c3846ca6-3c9c-4f02-978e-bee6148e0ba7","Type":"ContainerDied","Data":"d4e36ac0cee73007b853f0c1c304199f2fcbea25fe5d1f20d0b2bdd88f73bce2"}
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.122451 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wmfkx"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.124717 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r" event={"ID":"d2a09672-cb2b-4c7a-93f2-78a0e16d752f","Type":"ContainerStarted","Data":"8e93450d9b507f02ffafca07daa529ecd9634a65031cab25c8e2cbc719f48d64"}
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.139194 4788 scope.go:117] "RemoveContainer" containerID="9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737"
Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.139543 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737\": container with ID starting with 9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737 not found: ID does not exist" containerID="9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.139579 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737"} err="failed to get container status \"9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737\": rpc error: code = NotFound desc = could not find container \"9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737\": container with ID starting with 9df8d7cd28685c9f0063b634a45c71e6dda1a610c6cdfef67847566ff83ac737 not found: ID does not exist"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.139598 4788 scope.go:117] "RemoveContainer" containerID="65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.155429 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jdjtx"]
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.157752 4788 scope.go:117] "RemoveContainer" containerID="3c54b3808e0552c619d3914cb501ac16f79913e658f54925ef21e5b2c7d6a7be"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.165340 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jdjtx"]
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.171196 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wmfkx"]
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.175604 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wmfkx"]
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.179223 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2fqp"]
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.183337 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s2fqp"]
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.192278 4788 scope.go:117] "RemoveContainer" containerID="ac3f8ed3dcfc8253b3ee2a61d39029411bc301702a32188f364f7f9adbc6b00d"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.192930 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swpvd"]
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.194904 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eaaee2a4-db49-437f-a87b-98beb5e66e91" (UID: "eaaee2a4-db49-437f-a87b-98beb5e66e91"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.196476 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-swpvd"]
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.208919 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eaaee2a4-db49-437f-a87b-98beb5e66e91-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.222528 4788 scope.go:117] "RemoveContainer" containerID="65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c"
Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.223018 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c\": container with ID starting with 65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c not found: ID does not exist" containerID="65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.223065 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c"} err="failed to get container status \"65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c\": rpc error: code = NotFound desc = could not find container \"65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c\": container with ID starting with 65ca8ff81f61b712b4bcef4a1588ead4932fe6757d106ee8009b5d8b02189d3c not found: ID does not exist"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.223100 4788 scope.go:117] "RemoveContainer" containerID="3c54b3808e0552c619d3914cb501ac16f79913e658f54925ef21e5b2c7d6a7be"
Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.223521 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c54b3808e0552c619d3914cb501ac16f79913e658f54925ef21e5b2c7d6a7be\": container with ID starting with 3c54b3808e0552c619d3914cb501ac16f79913e658f54925ef21e5b2c7d6a7be not found: ID does not exist" containerID="3c54b3808e0552c619d3914cb501ac16f79913e658f54925ef21e5b2c7d6a7be"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.223562 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c54b3808e0552c619d3914cb501ac16f79913e658f54925ef21e5b2c7d6a7be"} err="failed to get container status \"3c54b3808e0552c619d3914cb501ac16f79913e658f54925ef21e5b2c7d6a7be\": rpc error: code = NotFound desc = could not find container \"3c54b3808e0552c619d3914cb501ac16f79913e658f54925ef21e5b2c7d6a7be\": container with ID starting with 3c54b3808e0552c619d3914cb501ac16f79913e658f54925ef21e5b2c7d6a7be not found: ID does not exist"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.223584 4788 scope.go:117] "RemoveContainer" containerID="ac3f8ed3dcfc8253b3ee2a61d39029411bc301702a32188f364f7f9adbc6b00d"
Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.224027 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3f8ed3dcfc8253b3ee2a61d39029411bc301702a32188f364f7f9adbc6b00d\": container with ID starting with ac3f8ed3dcfc8253b3ee2a61d39029411bc301702a32188f364f7f9adbc6b00d not found: ID does not exist" containerID="ac3f8ed3dcfc8253b3ee2a61d39029411bc301702a32188f364f7f9adbc6b00d"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.224118 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3f8ed3dcfc8253b3ee2a61d39029411bc301702a32188f364f7f9adbc6b00d"} err="failed to get container status \"ac3f8ed3dcfc8253b3ee2a61d39029411bc301702a32188f364f7f9adbc6b00d\": rpc error: code = NotFound desc = could not find container \"ac3f8ed3dcfc8253b3ee2a61d39029411bc301702a32188f364f7f9adbc6b00d\": container with ID starting with ac3f8ed3dcfc8253b3ee2a61d39029411bc301702a32188f364f7f9adbc6b00d not found: ID does not exist"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.224301 4788 scope.go:117] "RemoveContainer" containerID="6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.247567 4788 scope.go:117] "RemoveContainer" containerID="57ca5ca955b22eb0ed69e6ee86a3f5b17d765ccf015de1d464287d1f406c2471"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.265352 4788 scope.go:117] "RemoveContainer" containerID="af5bda2dfa60f08d7ab44a5b46ccb81d44046164981e60ba39b2ae1c021ae42d"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.281394 4788 scope.go:117] "RemoveContainer" containerID="6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a"
Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.281985 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a\": container with ID starting with 6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a not found: ID does not exist" containerID="6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.282085 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a"} err="failed to get container status \"6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a\": rpc error: code = NotFound desc = could not find container \"6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a\": container with ID starting with 6fa11e8d13f3b6429c9d02610cd04899422653e8336ca94eff7dd38a671f063a not found: ID does not exist"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.282192 4788 scope.go:117] "RemoveContainer" containerID="57ca5ca955b22eb0ed69e6ee86a3f5b17d765ccf015de1d464287d1f406c2471"
Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.284486 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57ca5ca955b22eb0ed69e6ee86a3f5b17d765ccf015de1d464287d1f406c2471\": container with ID starting with 57ca5ca955b22eb0ed69e6ee86a3f5b17d765ccf015de1d464287d1f406c2471 not found: ID does not exist" containerID="57ca5ca955b22eb0ed69e6ee86a3f5b17d765ccf015de1d464287d1f406c2471"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.284592 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57ca5ca955b22eb0ed69e6ee86a3f5b17d765ccf015de1d464287d1f406c2471"} err="failed to get container status \"57ca5ca955b22eb0ed69e6ee86a3f5b17d765ccf015de1d464287d1f406c2471\": rpc error: code = NotFound desc = could not find container \"57ca5ca955b22eb0ed69e6ee86a3f5b17d765ccf015de1d464287d1f406c2471\": container with ID starting with 57ca5ca955b22eb0ed69e6ee86a3f5b17d765ccf015de1d464287d1f406c2471 not found: ID does not exist"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.284676 4788 scope.go:117] "RemoveContainer" containerID="af5bda2dfa60f08d7ab44a5b46ccb81d44046164981e60ba39b2ae1c021ae42d"
Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.285863 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af5bda2dfa60f08d7ab44a5b46ccb81d44046164981e60ba39b2ae1c021ae42d\": container with ID starting with af5bda2dfa60f08d7ab44a5b46ccb81d44046164981e60ba39b2ae1c021ae42d not found: ID does not exist" containerID="af5bda2dfa60f08d7ab44a5b46ccb81d44046164981e60ba39b2ae1c021ae42d"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.285929 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af5bda2dfa60f08d7ab44a5b46ccb81d44046164981e60ba39b2ae1c021ae42d"} err="failed to get container status \"af5bda2dfa60f08d7ab44a5b46ccb81d44046164981e60ba39b2ae1c021ae42d\": rpc error: code = NotFound desc = could not find container \"af5bda2dfa60f08d7ab44a5b46ccb81d44046164981e60ba39b2ae1c021ae42d\": container with ID starting with af5bda2dfa60f08d7ab44a5b46ccb81d44046164981e60ba39b2ae1c021ae42d not found: ID does not exist"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.285958 4788 scope.go:117] "RemoveContainer" containerID="a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.304484 4788 scope.go:117] "RemoveContainer" containerID="d397b44d67cf5a6edc9fcfa218ab6faf30b8845855e7793a8e16358234ab6c88"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.321649 4788 scope.go:117] "RemoveContainer" containerID="85af9860d53099e9a2b042134be4c710fb240bdf1dddfcd3dfaed8f7571a6dbf"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.336303 4788 scope.go:117] "RemoveContainer" containerID="a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52"
Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.336992 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52\": container with ID starting with a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52 not found: ID does not exist" containerID="a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52"
Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.337051 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52"} err="failed to get container status \"a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52\": rpc error:
code = NotFound desc = could not find container \"a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52\": container with ID starting with a24695bf80f084fcdd16ef9ca734c4171bb3aa33138f505e2df50b4ace5f7a52 not found: ID does not exist" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.337091 4788 scope.go:117] "RemoveContainer" containerID="d397b44d67cf5a6edc9fcfa218ab6faf30b8845855e7793a8e16358234ab6c88" Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.337436 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d397b44d67cf5a6edc9fcfa218ab6faf30b8845855e7793a8e16358234ab6c88\": container with ID starting with d397b44d67cf5a6edc9fcfa218ab6faf30b8845855e7793a8e16358234ab6c88 not found: ID does not exist" containerID="d397b44d67cf5a6edc9fcfa218ab6faf30b8845855e7793a8e16358234ab6c88" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.337472 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d397b44d67cf5a6edc9fcfa218ab6faf30b8845855e7793a8e16358234ab6c88"} err="failed to get container status \"d397b44d67cf5a6edc9fcfa218ab6faf30b8845855e7793a8e16358234ab6c88\": rpc error: code = NotFound desc = could not find container \"d397b44d67cf5a6edc9fcfa218ab6faf30b8845855e7793a8e16358234ab6c88\": container with ID starting with d397b44d67cf5a6edc9fcfa218ab6faf30b8845855e7793a8e16358234ab6c88 not found: ID does not exist" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.337498 4788 scope.go:117] "RemoveContainer" containerID="85af9860d53099e9a2b042134be4c710fb240bdf1dddfcd3dfaed8f7571a6dbf" Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.337915 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85af9860d53099e9a2b042134be4c710fb240bdf1dddfcd3dfaed8f7571a6dbf\": container with ID starting with 
85af9860d53099e9a2b042134be4c710fb240bdf1dddfcd3dfaed8f7571a6dbf not found: ID does not exist" containerID="85af9860d53099e9a2b042134be4c710fb240bdf1dddfcd3dfaed8f7571a6dbf" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.337961 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85af9860d53099e9a2b042134be4c710fb240bdf1dddfcd3dfaed8f7571a6dbf"} err="failed to get container status \"85af9860d53099e9a2b042134be4c710fb240bdf1dddfcd3dfaed8f7571a6dbf\": rpc error: code = NotFound desc = could not find container \"85af9860d53099e9a2b042134be4c710fb240bdf1dddfcd3dfaed8f7571a6dbf\": container with ID starting with 85af9860d53099e9a2b042134be4c710fb240bdf1dddfcd3dfaed8f7571a6dbf not found: ID does not exist" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.337994 4788 scope.go:117] "RemoveContainer" containerID="eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.356397 4788 scope.go:117] "RemoveContainer" containerID="c66611af0eab474f3628d92cec7f40c9296bcfc830409e442f3ede5c31b4d7fa" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.377627 4788 scope.go:117] "RemoveContainer" containerID="58675e2cebfd873df0a5074dc1ae749e50ccf53da4a562cd29fc7dedc3f40a30" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.395374 4788 scope.go:117] "RemoveContainer" containerID="eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa" Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.395767 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa\": container with ID starting with eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa not found: ID does not exist" containerID="eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 
08:50:00.395800 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa"} err="failed to get container status \"eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa\": rpc error: code = NotFound desc = could not find container \"eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa\": container with ID starting with eb01cb556b067f1b0b8c3ad19d07c5fe920cccbc76cd4a03484d7684569bf6aa not found: ID does not exist" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.395827 4788 scope.go:117] "RemoveContainer" containerID="c66611af0eab474f3628d92cec7f40c9296bcfc830409e442f3ede5c31b4d7fa" Feb 19 08:50:00 crc kubenswrapper[4788]: E0219 08:50:00.396144 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66611af0eab474f3628d92cec7f40c9296bcfc830409e442f3ede5c31b4d7fa\": container with ID starting with c66611af0eab474f3628d92cec7f40c9296bcfc830409e442f3ede5c31b4d7fa not found: ID does not exist" containerID="c66611af0eab474f3628d92cec7f40c9296bcfc830409e442f3ede5c31b4d7fa" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.396262 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66611af0eab474f3628d92cec7f40c9296bcfc830409e442f3ede5c31b4d7fa"} err="failed to get container status \"c66611af0eab474f3628d92cec7f40c9296bcfc830409e442f3ede5c31b4d7fa\": rpc error: code = NotFound desc = could not find container \"c66611af0eab474f3628d92cec7f40c9296bcfc830409e442f3ede5c31b4d7fa\": container with ID starting with c66611af0eab474f3628d92cec7f40c9296bcfc830409e442f3ede5c31b4d7fa not found: ID does not exist" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.396314 4788 scope.go:117] "RemoveContainer" containerID="58675e2cebfd873df0a5074dc1ae749e50ccf53da4a562cd29fc7dedc3f40a30" Feb 19 08:50:00 crc 
kubenswrapper[4788]: E0219 08:50:00.396711 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58675e2cebfd873df0a5074dc1ae749e50ccf53da4a562cd29fc7dedc3f40a30\": container with ID starting with 58675e2cebfd873df0a5074dc1ae749e50ccf53da4a562cd29fc7dedc3f40a30 not found: ID does not exist" containerID="58675e2cebfd873df0a5074dc1ae749e50ccf53da4a562cd29fc7dedc3f40a30" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.396756 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58675e2cebfd873df0a5074dc1ae749e50ccf53da4a562cd29fc7dedc3f40a30"} err="failed to get container status \"58675e2cebfd873df0a5074dc1ae749e50ccf53da4a562cd29fc7dedc3f40a30\": rpc error: code = NotFound desc = could not find container \"58675e2cebfd873df0a5074dc1ae749e50ccf53da4a562cd29fc7dedc3f40a30\": container with ID starting with 58675e2cebfd873df0a5074dc1ae749e50ccf53da4a562cd29fc7dedc3f40a30 not found: ID does not exist" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.447261 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fx2sh"] Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.455098 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fx2sh"] Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.721589 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85dfd540-d029-4a79-a997-3f2f3796b7b1" path="/var/lib/kubelet/pods/85dfd540-d029-4a79-a997-3f2f3796b7b1/volumes" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.722299 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6" path="/var/lib/kubelet/pods/90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6/volumes" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.722749 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" path="/var/lib/kubelet/pods/c3846ca6-3c9c-4f02-978e-bee6148e0ba7/volumes" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.723769 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaaee2a4-db49-437f-a87b-98beb5e66e91" path="/var/lib/kubelet/pods/eaaee2a4-db49-437f-a87b-98beb5e66e91/volumes" Feb 19 08:50:00 crc kubenswrapper[4788]: I0219 08:50:00.724450 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" path="/var/lib/kubelet/pods/f727c8c6-b0d5-470e-bd9a-593b908dbef4/volumes" Feb 19 08:50:01 crc kubenswrapper[4788]: I0219 08:50:01.137669 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r" event={"ID":"d2a09672-cb2b-4c7a-93f2-78a0e16d752f","Type":"ContainerStarted","Data":"1919eb04489a4d91018e4f392ccf5dacf9c18bf11d9626e9ae1fa8b62e9759a4"} Feb 19 08:50:01 crc kubenswrapper[4788]: I0219 08:50:01.137922 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r" Feb 19 08:50:01 crc kubenswrapper[4788]: I0219 08:50:01.142037 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r" Feb 19 08:50:01 crc kubenswrapper[4788]: I0219 08:50:01.169671 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7fl6r" podStartSLOduration=2.169654004 podStartE2EDuration="2.169654004s" podCreationTimestamp="2026-02-19 08:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:50:01.154748434 +0000 UTC m=+303.142759926" watchObservedRunningTime="2026-02-19 08:50:01.169654004 +0000 UTC m=+303.157665476" Feb 19 08:50:12 crc kubenswrapper[4788]: 
I0219 08:50:12.877588 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t7bpz"] Feb 19 08:50:12 crc kubenswrapper[4788]: I0219 08:50:12.878377 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" podUID="c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" containerName="controller-manager" containerID="cri-o://b9b6ab176119d897c2f9a3e5060602f1c6497159c52d9db471ff155bfeaa6cfe" gracePeriod=30 Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.005377 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"] Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.005564 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" podUID="ec69babc-944a-4707-914f-5f1da38d6316" containerName="route-controller-manager" containerID="cri-o://6c4979e728fffe4a68cb6f93d718752956537d933cd7d6d40d17f0e933deaa5b" gracePeriod=30 Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.223497 4788 generic.go:334] "Generic (PLEG): container finished" podID="ec69babc-944a-4707-914f-5f1da38d6316" containerID="6c4979e728fffe4a68cb6f93d718752956537d933cd7d6d40d17f0e933deaa5b" exitCode=0 Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.223587 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" event={"ID":"ec69babc-944a-4707-914f-5f1da38d6316","Type":"ContainerDied","Data":"6c4979e728fffe4a68cb6f93d718752956537d933cd7d6d40d17f0e933deaa5b"} Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.227733 4788 generic.go:334] "Generic (PLEG): container finished" podID="c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" 
containerID="b9b6ab176119d897c2f9a3e5060602f1c6497159c52d9db471ff155bfeaa6cfe" exitCode=0 Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.227769 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" event={"ID":"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7","Type":"ContainerDied","Data":"b9b6ab176119d897c2f9a3e5060602f1c6497159c52d9db471ff155bfeaa6cfe"} Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.227786 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" event={"ID":"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7","Type":"ContainerDied","Data":"0b0ee3426058dbe9a8dd9f849b4d56bc740a08513892307e285e1bf6d93e87da"} Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.227800 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b0ee3426058dbe9a8dd9f849b4d56bc740a08513892307e285e1bf6d93e87da" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.243516 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.327226 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.367254 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-client-ca\") pod \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.367334 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-config\") pod \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.367358 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9945x\" (UniqueName: \"kubernetes.io/projected/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-kube-api-access-9945x\") pod \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.367388 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-serving-cert\") pod \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.367409 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-proxy-ca-bundles\") pod \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\" (UID: \"c4f8048f-eccd-44c1-b8b7-63aace0ec7f7\") " Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.368044 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-client-ca" (OuterVolumeSpecName: "client-ca") pod "c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" (UID: "c4f8048f-eccd-44c1-b8b7-63aace0ec7f7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.368218 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-config" (OuterVolumeSpecName: "config") pod "c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" (UID: "c4f8048f-eccd-44c1-b8b7-63aace0ec7f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.368277 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" (UID: "c4f8048f-eccd-44c1-b8b7-63aace0ec7f7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.373080 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-kube-api-access-9945x" (OuterVolumeSpecName: "kube-api-access-9945x") pod "c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" (UID: "c4f8048f-eccd-44c1-b8b7-63aace0ec7f7"). InnerVolumeSpecName "kube-api-access-9945x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.373278 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" (UID: "c4f8048f-eccd-44c1-b8b7-63aace0ec7f7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.468688 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-989lg\" (UniqueName: \"kubernetes.io/projected/ec69babc-944a-4707-914f-5f1da38d6316-kube-api-access-989lg\") pod \"ec69babc-944a-4707-914f-5f1da38d6316\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.468781 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec69babc-944a-4707-914f-5f1da38d6316-serving-cert\") pod \"ec69babc-944a-4707-914f-5f1da38d6316\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.468837 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-client-ca\") pod \"ec69babc-944a-4707-914f-5f1da38d6316\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.468894 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-config\") pod \"ec69babc-944a-4707-914f-5f1da38d6316\" (UID: \"ec69babc-944a-4707-914f-5f1da38d6316\") " Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.469163 4788 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.469186 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:13 crc 
kubenswrapper[4788]: I0219 08:50:13.469204 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9945x\" (UniqueName: \"kubernetes.io/projected/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-kube-api-access-9945x\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.469224 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.469240 4788 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.470138 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-config" (OuterVolumeSpecName: "config") pod "ec69babc-944a-4707-914f-5f1da38d6316" (UID: "ec69babc-944a-4707-914f-5f1da38d6316"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.470217 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-client-ca" (OuterVolumeSpecName: "client-ca") pod "ec69babc-944a-4707-914f-5f1da38d6316" (UID: "ec69babc-944a-4707-914f-5f1da38d6316"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.473367 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec69babc-944a-4707-914f-5f1da38d6316-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ec69babc-944a-4707-914f-5f1da38d6316" (UID: "ec69babc-944a-4707-914f-5f1da38d6316"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.473481 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec69babc-944a-4707-914f-5f1da38d6316-kube-api-access-989lg" (OuterVolumeSpecName: "kube-api-access-989lg") pod "ec69babc-944a-4707-914f-5f1da38d6316" (UID: "ec69babc-944a-4707-914f-5f1da38d6316"). InnerVolumeSpecName "kube-api-access-989lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.570963 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec69babc-944a-4707-914f-5f1da38d6316-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.571017 4788 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.571038 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec69babc-944a-4707-914f-5f1da38d6316-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:13 crc kubenswrapper[4788]: I0219 08:50:13.571056 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-989lg\" (UniqueName: \"kubernetes.io/projected/ec69babc-944a-4707-914f-5f1da38d6316-kube-api-access-989lg\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.233638 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-t7bpz" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.233663 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.233689 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl" event={"ID":"ec69babc-944a-4707-914f-5f1da38d6316","Type":"ContainerDied","Data":"e7863073ffe9e15cbb0804234f937e04176f35c0e559583e3ec67bcd98eb1527"} Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.234284 4788 scope.go:117] "RemoveContainer" containerID="6c4979e728fffe4a68cb6f93d718752956537d933cd7d6d40d17f0e933deaa5b" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.277348 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t7bpz"] Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.280687 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-t7bpz"] Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.300706 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"] Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.307720 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-khmhl"] Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.470634 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs"] Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471001 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" containerName="extract-content" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471033 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" 
containerName="extract-content" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471054 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec69babc-944a-4707-914f-5f1da38d6316" containerName="route-controller-manager" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471065 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec69babc-944a-4707-914f-5f1da38d6316" containerName="route-controller-manager" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471081 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471093 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471109 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerName="extract-content" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471119 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerName="extract-content" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471134 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471144 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471157 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6" containerName="marketplace-operator" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471166 4788 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6" containerName="marketplace-operator" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471178 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerName="extract-content" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471207 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerName="extract-content" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471220 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerName="extract-utilities" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471229 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerName="extract-utilities" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471240 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerName="extract-utilities" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471288 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerName="extract-utilities" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471298 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471307 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471320 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471330 4788 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471348 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" containerName="controller-manager" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471359 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" containerName="controller-manager" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471372 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" containerName="extract-utilities" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471429 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" containerName="extract-utilities" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471451 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" containerName="extract-utilities" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471463 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" containerName="extract-utilities" Feb 19 08:50:14 crc kubenswrapper[4788]: E0219 08:50:14.471478 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" containerName="extract-content" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471489 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" containerName="extract-content" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471604 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec69babc-944a-4707-914f-5f1da38d6316" containerName="route-controller-manager" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471617 4788 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" containerName="controller-manager" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471682 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="85dfd540-d029-4a79-a997-3f2f3796b7b1" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471693 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d01cd6-8d69-4cac-ba9b-41ae8c8c8ff6" containerName="marketplace-operator" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471705 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3846ca6-3c9c-4f02-978e-bee6148e0ba7" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471718 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaaee2a4-db49-437f-a87b-98beb5e66e91" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.471730 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f727c8c6-b0d5-470e-bd9a-593b908dbef4" containerName="registry-server" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.472417 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.473023 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr"] Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.474839 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.475054 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.475145 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.475081 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.475111 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.475621 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.477338 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.480073 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.480195 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.480321 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.480400 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.480437 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.480743 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.485625 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs"] Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.490511 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr"] Feb 19 08:50:14 crc 
kubenswrapper[4788]: I0219 08:50:14.491238 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.584881 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-config\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.585020 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-client-ca\") pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.585067 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-proxy-ca-bundles\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.585111 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46be8ec0-3201-4280-8f02-e8b058d18ca2-serving-cert\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.585170 4788 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-config\") pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.585214 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-client-ca\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.585301 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a981a43-0e78-4b13-8a7f-029dda4075fa-serving-cert\") pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.585366 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vd2\" (UniqueName: \"kubernetes.io/projected/46be8ec0-3201-4280-8f02-e8b058d18ca2-kube-api-access-c7vd2\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.585411 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9txvd\" (UniqueName: \"kubernetes.io/projected/3a981a43-0e78-4b13-8a7f-029dda4075fa-kube-api-access-9txvd\") 
pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.687095 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46be8ec0-3201-4280-8f02-e8b058d18ca2-serving-cert\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.687168 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-config\") pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.687208 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-client-ca\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.687274 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a981a43-0e78-4b13-8a7f-029dda4075fa-serving-cert\") pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.687346 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-c7vd2\" (UniqueName: \"kubernetes.io/projected/46be8ec0-3201-4280-8f02-e8b058d18ca2-kube-api-access-c7vd2\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.687389 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9txvd\" (UniqueName: \"kubernetes.io/projected/3a981a43-0e78-4b13-8a7f-029dda4075fa-kube-api-access-9txvd\") pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.687435 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-config\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.687479 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-client-ca\") pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.687516 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-proxy-ca-bundles\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " 
pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.688369 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-client-ca\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.688386 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-config\") pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.689275 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-config\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.689375 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-proxy-ca-bundles\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.689439 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-client-ca\") pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: 
\"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.701148 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46be8ec0-3201-4280-8f02-e8b058d18ca2-serving-cert\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.701157 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a981a43-0e78-4b13-8a7f-029dda4075fa-serving-cert\") pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.703746 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9txvd\" (UniqueName: \"kubernetes.io/projected/3a981a43-0e78-4b13-8a7f-029dda4075fa-kube-api-access-9txvd\") pod \"route-controller-manager-6d6cf6f5f4-6t4bs\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.708047 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vd2\" (UniqueName: \"kubernetes.io/projected/46be8ec0-3201-4280-8f02-e8b058d18ca2-kube-api-access-c7vd2\") pod \"controller-manager-6fcbf4d5b-2kjgr\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.724758 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c4f8048f-eccd-44c1-b8b7-63aace0ec7f7" path="/var/lib/kubelet/pods/c4f8048f-eccd-44c1-b8b7-63aace0ec7f7/volumes" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.725828 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec69babc-944a-4707-914f-5f1da38d6316" path="/var/lib/kubelet/pods/ec69babc-944a-4707-914f-5f1da38d6316/volumes" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.812961 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:14 crc kubenswrapper[4788]: I0219 08:50:14.832095 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:15 crc kubenswrapper[4788]: I0219 08:50:15.057216 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr"] Feb 19 08:50:15 crc kubenswrapper[4788]: I0219 08:50:15.223057 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs"] Feb 19 08:50:15 crc kubenswrapper[4788]: W0219 08:50:15.233499 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a981a43_0e78_4b13_8a7f_029dda4075fa.slice/crio-ee75b91ec004db4fa5ba86d959ed7cabdad7402c939e3d41269333400fdeaa47 WatchSource:0}: Error finding container ee75b91ec004db4fa5ba86d959ed7cabdad7402c939e3d41269333400fdeaa47: Status 404 returned error can't find the container with id ee75b91ec004db4fa5ba86d959ed7cabdad7402c939e3d41269333400fdeaa47 Feb 19 08:50:15 crc kubenswrapper[4788]: I0219 08:50:15.240538 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" 
event={"ID":"46be8ec0-3201-4280-8f02-e8b058d18ca2","Type":"ContainerStarted","Data":"9d56d87ace5f85363e1b4220149d695dd87f3448eceeaa41edb5c5c571004d88"} Feb 19 08:50:15 crc kubenswrapper[4788]: I0219 08:50:15.240588 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" event={"ID":"46be8ec0-3201-4280-8f02-e8b058d18ca2","Type":"ContainerStarted","Data":"a26f5f66c4b0501cc990368533a25235f8b5dcc8b92527f6ea1bea139d36de4c"} Feb 19 08:50:15 crc kubenswrapper[4788]: I0219 08:50:15.241856 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:15 crc kubenswrapper[4788]: I0219 08:50:15.243193 4788 patch_prober.go:28] interesting pod/controller-manager-6fcbf4d5b-2kjgr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Feb 19 08:50:15 crc kubenswrapper[4788]: I0219 08:50:15.243219 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" podUID="46be8ec0-3201-4280-8f02-e8b058d18ca2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Feb 19 08:50:15 crc kubenswrapper[4788]: I0219 08:50:15.243478 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" event={"ID":"3a981a43-0e78-4b13-8a7f-029dda4075fa","Type":"ContainerStarted","Data":"ee75b91ec004db4fa5ba86d959ed7cabdad7402c939e3d41269333400fdeaa47"} Feb 19 08:50:15 crc kubenswrapper[4788]: I0219 08:50:15.255375 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" podStartSLOduration=2.255350342 podStartE2EDuration="2.255350342s" podCreationTimestamp="2026-02-19 08:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:50:15.253585033 +0000 UTC m=+317.241596505" watchObservedRunningTime="2026-02-19 08:50:15.255350342 +0000 UTC m=+317.243361834" Feb 19 08:50:16 crc kubenswrapper[4788]: I0219 08:50:16.252631 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" event={"ID":"3a981a43-0e78-4b13-8a7f-029dda4075fa","Type":"ContainerStarted","Data":"c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964"} Feb 19 08:50:16 crc kubenswrapper[4788]: I0219 08:50:16.256510 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:16 crc kubenswrapper[4788]: I0219 08:50:16.270753 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" podStartSLOduration=3.270730463 podStartE2EDuration="3.270730463s" podCreationTimestamp="2026-02-19 08:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:50:16.268896071 +0000 UTC m=+318.256907533" watchObservedRunningTime="2026-02-19 08:50:16.270730463 +0000 UTC m=+318.258741945" Feb 19 08:50:17 crc kubenswrapper[4788]: I0219 08:50:17.258165 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:17 crc kubenswrapper[4788]: I0219 08:50:17.267084 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:50:22 crc kubenswrapper[4788]: I0219 08:50:22.140035 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 08:50:22 crc kubenswrapper[4788]: I0219 08:50:22.140176 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 08:50:32 crc kubenswrapper[4788]: I0219 08:50:32.884639 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr"] Feb 19 08:50:32 crc kubenswrapper[4788]: I0219 08:50:32.885675 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" podUID="46be8ec0-3201-4280-8f02-e8b058d18ca2" containerName="controller-manager" containerID="cri-o://9d56d87ace5f85363e1b4220149d695dd87f3448eceeaa41edb5c5c571004d88" gracePeriod=30 Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.362363 4788 generic.go:334] "Generic (PLEG): container finished" podID="46be8ec0-3201-4280-8f02-e8b058d18ca2" containerID="9d56d87ace5f85363e1b4220149d695dd87f3448eceeaa41edb5c5c571004d88" exitCode=0 Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.362427 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" 
event={"ID":"46be8ec0-3201-4280-8f02-e8b058d18ca2","Type":"ContainerDied","Data":"9d56d87ace5f85363e1b4220149d695dd87f3448eceeaa41edb5c5c571004d88"} Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.472371 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.545141 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-config\") pod \"46be8ec0-3201-4280-8f02-e8b058d18ca2\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.545360 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46be8ec0-3201-4280-8f02-e8b058d18ca2-serving-cert\") pod \"46be8ec0-3201-4280-8f02-e8b058d18ca2\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.545515 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-proxy-ca-bundles\") pod \"46be8ec0-3201-4280-8f02-e8b058d18ca2\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.546838 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7vd2\" (UniqueName: \"kubernetes.io/projected/46be8ec0-3201-4280-8f02-e8b058d18ca2-kube-api-access-c7vd2\") pod \"46be8ec0-3201-4280-8f02-e8b058d18ca2\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.546931 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-client-ca\") pod \"46be8ec0-3201-4280-8f02-e8b058d18ca2\" (UID: \"46be8ec0-3201-4280-8f02-e8b058d18ca2\") " Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.547797 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-client-ca" (OuterVolumeSpecName: "client-ca") pod "46be8ec0-3201-4280-8f02-e8b058d18ca2" (UID: "46be8ec0-3201-4280-8f02-e8b058d18ca2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.547846 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "46be8ec0-3201-4280-8f02-e8b058d18ca2" (UID: "46be8ec0-3201-4280-8f02-e8b058d18ca2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.548153 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-config" (OuterVolumeSpecName: "config") pod "46be8ec0-3201-4280-8f02-e8b058d18ca2" (UID: "46be8ec0-3201-4280-8f02-e8b058d18ca2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.548902 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.548947 4788 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.548969 4788 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/46be8ec0-3201-4280-8f02-e8b058d18ca2-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.555622 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46be8ec0-3201-4280-8f02-e8b058d18ca2-kube-api-access-c7vd2" (OuterVolumeSpecName: "kube-api-access-c7vd2") pod "46be8ec0-3201-4280-8f02-e8b058d18ca2" (UID: "46be8ec0-3201-4280-8f02-e8b058d18ca2"). InnerVolumeSpecName "kube-api-access-c7vd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.558993 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46be8ec0-3201-4280-8f02-e8b058d18ca2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "46be8ec0-3201-4280-8f02-e8b058d18ca2" (UID: "46be8ec0-3201-4280-8f02-e8b058d18ca2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.649888 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46be8ec0-3201-4280-8f02-e8b058d18ca2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:33 crc kubenswrapper[4788]: I0219 08:50:33.649949 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7vd2\" (UniqueName: \"kubernetes.io/projected/46be8ec0-3201-4280-8f02-e8b058d18ca2-kube-api-access-c7vd2\") on node \"crc\" DevicePath \"\"" Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.373989 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr" event={"ID":"46be8ec0-3201-4280-8f02-e8b058d18ca2","Type":"ContainerDied","Data":"a26f5f66c4b0501cc990368533a25235f8b5dcc8b92527f6ea1bea139d36de4c"} Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.374027 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.374144 4788 scope.go:117] "RemoveContainer" containerID="9d56d87ace5f85363e1b4220149d695dd87f3448eceeaa41edb5c5c571004d88"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.417533 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr"]
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.424876 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6fcbf4d5b-2kjgr"]
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.491212 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-786c7b745d-b2rhq"]
Feb 19 08:50:34 crc kubenswrapper[4788]: E0219 08:50:34.491613 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46be8ec0-3201-4280-8f02-e8b058d18ca2" containerName="controller-manager"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.491642 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="46be8ec0-3201-4280-8f02-e8b058d18ca2" containerName="controller-manager"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.491821 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="46be8ec0-3201-4280-8f02-e8b058d18ca2" containerName="controller-manager"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.492470 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.496220 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.497124 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.497397 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.497682 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.498094 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.498944 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.519948 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.550163 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-786c7b745d-b2rhq"]
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.562194 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07db1ce6-6512-4da7-b113-4002162e8698-serving-cert\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.562334 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07db1ce6-6512-4da7-b113-4002162e8698-proxy-ca-bundles\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.562389 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gj99\" (UniqueName: \"kubernetes.io/projected/07db1ce6-6512-4da7-b113-4002162e8698-kube-api-access-2gj99\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.562600 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07db1ce6-6512-4da7-b113-4002162e8698-config\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.562756 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07db1ce6-6512-4da7-b113-4002162e8698-client-ca\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.664279 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07db1ce6-6512-4da7-b113-4002162e8698-client-ca\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.664411 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07db1ce6-6512-4da7-b113-4002162e8698-serving-cert\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.664455 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07db1ce6-6512-4da7-b113-4002162e8698-proxy-ca-bundles\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.664505 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gj99\" (UniqueName: \"kubernetes.io/projected/07db1ce6-6512-4da7-b113-4002162e8698-kube-api-access-2gj99\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.664628 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07db1ce6-6512-4da7-b113-4002162e8698-config\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.666158 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07db1ce6-6512-4da7-b113-4002162e8698-client-ca\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.666841 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/07db1ce6-6512-4da7-b113-4002162e8698-proxy-ca-bundles\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.667297 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07db1ce6-6512-4da7-b113-4002162e8698-config\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.679662 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07db1ce6-6512-4da7-b113-4002162e8698-serving-cert\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.697406 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gj99\" (UniqueName: \"kubernetes.io/projected/07db1ce6-6512-4da7-b113-4002162e8698-kube-api-access-2gj99\") pod \"controller-manager-786c7b745d-b2rhq\" (UID: \"07db1ce6-6512-4da7-b113-4002162e8698\") " pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.724527 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46be8ec0-3201-4280-8f02-e8b058d18ca2" path="/var/lib/kubelet/pods/46be8ec0-3201-4280-8f02-e8b058d18ca2/volumes"
Feb 19 08:50:34 crc kubenswrapper[4788]: I0219 08:50:34.828659 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:35 crc kubenswrapper[4788]: I0219 08:50:35.238834 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-786c7b745d-b2rhq"]
Feb 19 08:50:35 crc kubenswrapper[4788]: W0219 08:50:35.246142 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07db1ce6_6512_4da7_b113_4002162e8698.slice/crio-8fda50e18ca0e583d90aba687c224aebc50f0b39e3c3b87024137731ab0ce97b WatchSource:0}: Error finding container 8fda50e18ca0e583d90aba687c224aebc50f0b39e3c3b87024137731ab0ce97b: Status 404 returned error can't find the container with id 8fda50e18ca0e583d90aba687c224aebc50f0b39e3c3b87024137731ab0ce97b
Feb 19 08:50:35 crc kubenswrapper[4788]: I0219 08:50:35.380200 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq" event={"ID":"07db1ce6-6512-4da7-b113-4002162e8698","Type":"ContainerStarted","Data":"2575fe3d9781371d943619e9524573ec6867f6c190618f50770486d415c6bcbc"}
Feb 19 08:50:35 crc kubenswrapper[4788]: I0219 08:50:35.380275 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq" event={"ID":"07db1ce6-6512-4da7-b113-4002162e8698","Type":"ContainerStarted","Data":"8fda50e18ca0e583d90aba687c224aebc50f0b39e3c3b87024137731ab0ce97b"}
Feb 19 08:50:35 crc kubenswrapper[4788]: I0219 08:50:35.381489 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:35 crc kubenswrapper[4788]: I0219 08:50:35.384319 4788 patch_prober.go:28] interesting pod/controller-manager-786c7b745d-b2rhq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body=
Feb 19 08:50:35 crc kubenswrapper[4788]: I0219 08:50:35.384362 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq" podUID="07db1ce6-6512-4da7-b113-4002162e8698" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused"
Feb 19 08:50:36 crc kubenswrapper[4788]: I0219 08:50:36.398017 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq"
Feb 19 08:50:36 crc kubenswrapper[4788]: I0219 08:50:36.416898 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-786c7b745d-b2rhq" podStartSLOduration=4.416872606 podStartE2EDuration="4.416872606s" podCreationTimestamp="2026-02-19 08:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:50:35.396832435 +0000 UTC m=+337.384843917" watchObservedRunningTime="2026-02-19 08:50:36.416872606 +0000 UTC m=+338.404884118"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.139582 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.140084 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.569333 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zm4v9"]
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.570484 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.585607 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zm4v9"]
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.618849 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e41c20f-1c69-4371-a672-81feeebbe4a6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.618942 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.618990 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e41c20f-1c69-4371-a672-81feeebbe4a6-registry-tls\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.619025 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e41c20f-1c69-4371-a672-81feeebbe4a6-bound-sa-token\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.619066 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e41c20f-1c69-4371-a672-81feeebbe4a6-trusted-ca\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.619132 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e41c20f-1c69-4371-a672-81feeebbe4a6-registry-certificates\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.619297 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqbdh\" (UniqueName: \"kubernetes.io/projected/8e41c20f-1c69-4371-a672-81feeebbe4a6-kube-api-access-cqbdh\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.619337 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e41c20f-1c69-4371-a672-81feeebbe4a6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.665856 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.721203 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e41c20f-1c69-4371-a672-81feeebbe4a6-trusted-ca\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.721338 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e41c20f-1c69-4371-a672-81feeebbe4a6-registry-certificates\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.721441 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqbdh\" (UniqueName: \"kubernetes.io/projected/8e41c20f-1c69-4371-a672-81feeebbe4a6-kube-api-access-cqbdh\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.721500 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e41c20f-1c69-4371-a672-81feeebbe4a6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.721564 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e41c20f-1c69-4371-a672-81feeebbe4a6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.721590 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e41c20f-1c69-4371-a672-81feeebbe4a6-registry-tls\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.721614 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e41c20f-1c69-4371-a672-81feeebbe4a6-bound-sa-token\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.722185 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e41c20f-1c69-4371-a672-81feeebbe4a6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.722835 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e41c20f-1c69-4371-a672-81feeebbe4a6-registry-certificates\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.723990 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e41c20f-1c69-4371-a672-81feeebbe4a6-trusted-ca\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.732061 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e41c20f-1c69-4371-a672-81feeebbe4a6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.732367 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e41c20f-1c69-4371-a672-81feeebbe4a6-registry-tls\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.751023 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqbdh\" (UniqueName: \"kubernetes.io/projected/8e41c20f-1c69-4371-a672-81feeebbe4a6-kube-api-access-cqbdh\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.755039 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e41c20f-1c69-4371-a672-81feeebbe4a6-bound-sa-token\") pod \"image-registry-66df7c8f76-zm4v9\" (UID: \"8e41c20f-1c69-4371-a672-81feeebbe4a6\") " pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:52 crc kubenswrapper[4788]: I0219 08:50:52.937706 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:53 crc kubenswrapper[4788]: I0219 08:50:53.449722 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-zm4v9"]
Feb 19 08:50:53 crc kubenswrapper[4788]: I0219 08:50:53.523039 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9" event={"ID":"8e41c20f-1c69-4371-a672-81feeebbe4a6","Type":"ContainerStarted","Data":"d450fe3bb440969715924f1751f6526c6b66e8ebf7e50c9db7513e1b5d6a555d"}
Feb 19 08:50:54 crc kubenswrapper[4788]: I0219 08:50:54.531514 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9" event={"ID":"8e41c20f-1c69-4371-a672-81feeebbe4a6","Type":"ContainerStarted","Data":"ccdde2dcc531870580acbb5dc99512583c40656a6b816356877762b1e80f5384"}
Feb 19 08:50:54 crc kubenswrapper[4788]: I0219 08:50:54.531806 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9"
Feb 19 08:50:54 crc kubenswrapper[4788]: I0219 08:50:54.564809 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9" podStartSLOduration=2.564782487 podStartE2EDuration="2.564782487s" podCreationTimestamp="2026-02-19 08:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:50:54.559567578 +0000 UTC m=+356.547579060" watchObservedRunningTime="2026-02-19 08:50:54.564782487 +0000 UTC m=+356.552793969"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.184103 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4jxxx"]
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.186865 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jxxx"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.189843 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.197843 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jxxx"]
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.264151 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f19f6400-b7f1-4a05-beb6-dc7ff4e23d71-catalog-content\") pod \"redhat-marketplace-4jxxx\" (UID: \"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71\") " pod="openshift-marketplace/redhat-marketplace-4jxxx"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.264224 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f19f6400-b7f1-4a05-beb6-dc7ff4e23d71-utilities\") pod \"redhat-marketplace-4jxxx\" (UID: \"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71\") " pod="openshift-marketplace/redhat-marketplace-4jxxx"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.264331 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l7vl\" (UniqueName: \"kubernetes.io/projected/f19f6400-b7f1-4a05-beb6-dc7ff4e23d71-kube-api-access-8l7vl\") pod \"redhat-marketplace-4jxxx\" (UID: \"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71\") " pod="openshift-marketplace/redhat-marketplace-4jxxx"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.365954 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f19f6400-b7f1-4a05-beb6-dc7ff4e23d71-utilities\") pod \"redhat-marketplace-4jxxx\" (UID: \"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71\") " pod="openshift-marketplace/redhat-marketplace-4jxxx"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.366026 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l7vl\" (UniqueName: \"kubernetes.io/projected/f19f6400-b7f1-4a05-beb6-dc7ff4e23d71-kube-api-access-8l7vl\") pod \"redhat-marketplace-4jxxx\" (UID: \"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71\") " pod="openshift-marketplace/redhat-marketplace-4jxxx"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.366087 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f19f6400-b7f1-4a05-beb6-dc7ff4e23d71-catalog-content\") pod \"redhat-marketplace-4jxxx\" (UID: \"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71\") " pod="openshift-marketplace/redhat-marketplace-4jxxx"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.366458 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f19f6400-b7f1-4a05-beb6-dc7ff4e23d71-utilities\") pod \"redhat-marketplace-4jxxx\" (UID: \"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71\") " pod="openshift-marketplace/redhat-marketplace-4jxxx"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.366555 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f19f6400-b7f1-4a05-beb6-dc7ff4e23d71-catalog-content\") pod \"redhat-marketplace-4jxxx\" (UID: \"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71\") " pod="openshift-marketplace/redhat-marketplace-4jxxx"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.377646 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hxdcj"]
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.378628 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hxdcj"
Feb 19 08:51:07 crc kubenswrapper[4788]: W0219 08:51:07.380574 4788 reflector.go:561] object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh": failed to list *v1.Secret: secrets "redhat-operators-dockercfg-ct8rh" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-marketplace": no relationship found between node 'crc' and this object
Feb 19 08:51:07 crc kubenswrapper[4788]: E0219 08:51:07.380619 4788 reflector.go:158] "Unhandled Error" err="object-\"openshift-marketplace\"/\"redhat-operators-dockercfg-ct8rh\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"redhat-operators-dockercfg-ct8rh\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-marketplace\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.389622 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hxdcj"]
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.392940 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l7vl\" (UniqueName: \"kubernetes.io/projected/f19f6400-b7f1-4a05-beb6-dc7ff4e23d71-kube-api-access-8l7vl\") pod \"redhat-marketplace-4jxxx\" (UID: \"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71\") " pod="openshift-marketplace/redhat-marketplace-4jxxx"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.467183 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8935778-1ae8-4c0c-8189-e6240f5c2d23-utilities\") pod \"redhat-operators-hxdcj\" (UID: \"e8935778-1ae8-4c0c-8189-e6240f5c2d23\") " pod="openshift-marketplace/redhat-operators-hxdcj"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.467249 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8935778-1ae8-4c0c-8189-e6240f5c2d23-catalog-content\") pod \"redhat-operators-hxdcj\" (UID: \"e8935778-1ae8-4c0c-8189-e6240f5c2d23\") " pod="openshift-marketplace/redhat-operators-hxdcj"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.467300 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w84lk\" (UniqueName: \"kubernetes.io/projected/e8935778-1ae8-4c0c-8189-e6240f5c2d23-kube-api-access-w84lk\") pod \"redhat-operators-hxdcj\" (UID: \"e8935778-1ae8-4c0c-8189-e6240f5c2d23\") " pod="openshift-marketplace/redhat-operators-hxdcj"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.520995 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jxxx"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.568441 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8935778-1ae8-4c0c-8189-e6240f5c2d23-catalog-content\") pod \"redhat-operators-hxdcj\" (UID: \"e8935778-1ae8-4c0c-8189-e6240f5c2d23\") " pod="openshift-marketplace/redhat-operators-hxdcj"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.568989 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w84lk\" (UniqueName: \"kubernetes.io/projected/e8935778-1ae8-4c0c-8189-e6240f5c2d23-kube-api-access-w84lk\") pod \"redhat-operators-hxdcj\" (UID: \"e8935778-1ae8-4c0c-8189-e6240f5c2d23\") " pod="openshift-marketplace/redhat-operators-hxdcj"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.569184 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8935778-1ae8-4c0c-8189-e6240f5c2d23-utilities\") pod \"redhat-operators-hxdcj\" (UID: \"e8935778-1ae8-4c0c-8189-e6240f5c2d23\") " pod="openshift-marketplace/redhat-operators-hxdcj"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.569650 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8935778-1ae8-4c0c-8189-e6240f5c2d23-utilities\") pod \"redhat-operators-hxdcj\" (UID: \"e8935778-1ae8-4c0c-8189-e6240f5c2d23\") " pod="openshift-marketplace/redhat-operators-hxdcj"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.569675 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8935778-1ae8-4c0c-8189-e6240f5c2d23-catalog-content\") pod \"redhat-operators-hxdcj\" (UID: \"e8935778-1ae8-4c0c-8189-e6240f5c2d23\") " pod="openshift-marketplace/redhat-operators-hxdcj"
Feb 19 08:51:07 crc kubenswrapper[4788]: I0219 08:51:07.604952 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w84lk\" (UniqueName: \"kubernetes.io/projected/e8935778-1ae8-4c0c-8189-e6240f5c2d23-kube-api-access-w84lk\") pod \"redhat-operators-hxdcj\" (UID: \"e8935778-1ae8-4c0c-8189-e6240f5c2d23\") " pod="openshift-marketplace/redhat-operators-hxdcj"
Feb 19 08:51:08 crc kubenswrapper[4788]: I0219 08:51:08.029502 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jxxx"]
Feb 19 08:51:08 crc kubenswrapper[4788]: I0219 08:51:08.624792 4788 generic.go:334] "Generic (PLEG): container finished" podID="f19f6400-b7f1-4a05-beb6-dc7ff4e23d71" containerID="a1608a0743ddc1337595cc320f5c9031a20536f405910ffa209be45b00f54429" exitCode=0
Feb 19 08:51:08 crc kubenswrapper[4788]: I0219 08:51:08.624850 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jxxx" event={"ID":"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71","Type":"ContainerDied","Data":"a1608a0743ddc1337595cc320f5c9031a20536f405910ffa209be45b00f54429"}
Feb 19 08:51:08 crc kubenswrapper[4788]: I0219 08:51:08.625011 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jxxx" event={"ID":"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71","Type":"ContainerStarted","Data":"7ca329a0e910daf0aa68b5a3ca260dc6525e6bd97820fea1df5a256837f4f814"}
Feb 19 08:51:08 crc kubenswrapper[4788]: I0219 08:51:08.695767 4788 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-marketplace/redhat-operators-hxdcj" secret="" err="failed to sync secret cache: timed out waiting for the condition"
Feb 19 08:51:08 crc kubenswrapper[4788]: I0219 08:51:08.695859 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hxdcj"
Feb 19 08:51:08 crc kubenswrapper[4788]: I0219 08:51:08.697288 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.156495 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hxdcj"]
Feb 19 08:51:09 crc kubenswrapper[4788]: W0219 08:51:09.173512 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8935778_1ae8_4c0c_8189_e6240f5c2d23.slice/crio-c8c2bf4053996804bee617d8e1dc538bd708b425a9cc0f9a27d6b632911bea2a WatchSource:0}: Error finding container c8c2bf4053996804bee617d8e1dc538bd708b425a9cc0f9a27d6b632911bea2a: Status 404 returned error can't find the container with id c8c2bf4053996804bee617d8e1dc538bd708b425a9cc0f9a27d6b632911bea2a
Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.583707 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qxq4d"]
Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.597186 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qxq4d"
Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.602994 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.612573 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxq4d"]
Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.633494 4788 generic.go:334] "Generic (PLEG): container finished" podID="e8935778-1ae8-4c0c-8189-e6240f5c2d23" containerID="38de40203a18a6f97e5f966b29517f21d8860452daff02c97b8b9d3524bc7383" exitCode=0
Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.633574 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxdcj" event={"ID":"e8935778-1ae8-4c0c-8189-e6240f5c2d23","Type":"ContainerDied","Data":"38de40203a18a6f97e5f966b29517f21d8860452daff02c97b8b9d3524bc7383"}
Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.633602 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxdcj" event={"ID":"e8935778-1ae8-4c0c-8189-e6240f5c2d23","Type":"ContainerStarted","Data":"c8c2bf4053996804bee617d8e1dc538bd708b425a9cc0f9a27d6b632911bea2a"}
Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.636795 4788 generic.go:334] "Generic (PLEG): container finished" podID="f19f6400-b7f1-4a05-beb6-dc7ff4e23d71" containerID="3fbaafed4e218cdb2b263b34b927e001f8ef5785cf5eb3d4396ad5652d605393" exitCode=0
Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.636864 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jxxx" event={"ID":"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71","Type":"ContainerDied","Data":"3fbaafed4e218cdb2b263b34b927e001f8ef5785cf5eb3d4396ad5652d605393"}
Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.717973 4788
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ff25313-66ec-4853-bf80-45bec9ab0ccd-catalog-content\") pod \"community-operators-qxq4d\" (UID: \"8ff25313-66ec-4853-bf80-45bec9ab0ccd\") " pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.718160 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5phr6\" (UniqueName: \"kubernetes.io/projected/8ff25313-66ec-4853-bf80-45bec9ab0ccd-kube-api-access-5phr6\") pod \"community-operators-qxq4d\" (UID: \"8ff25313-66ec-4853-bf80-45bec9ab0ccd\") " pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.718220 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ff25313-66ec-4853-bf80-45bec9ab0ccd-utilities\") pod \"community-operators-qxq4d\" (UID: \"8ff25313-66ec-4853-bf80-45bec9ab0ccd\") " pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.778702 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b2kvp"] Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.779875 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.782370 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.800826 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b2kvp"] Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.821441 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ff25313-66ec-4853-bf80-45bec9ab0ccd-utilities\") pod \"community-operators-qxq4d\" (UID: \"8ff25313-66ec-4853-bf80-45bec9ab0ccd\") " pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.822043 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ff25313-66ec-4853-bf80-45bec9ab0ccd-utilities\") pod \"community-operators-qxq4d\" (UID: \"8ff25313-66ec-4853-bf80-45bec9ab0ccd\") " pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.822179 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ff25313-66ec-4853-bf80-45bec9ab0ccd-catalog-content\") pod \"community-operators-qxq4d\" (UID: \"8ff25313-66ec-4853-bf80-45bec9ab0ccd\") " pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.822264 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5phr6\" (UniqueName: \"kubernetes.io/projected/8ff25313-66ec-4853-bf80-45bec9ab0ccd-kube-api-access-5phr6\") pod \"community-operators-qxq4d\" (UID: \"8ff25313-66ec-4853-bf80-45bec9ab0ccd\") " 
pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.825948 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ff25313-66ec-4853-bf80-45bec9ab0ccd-catalog-content\") pod \"community-operators-qxq4d\" (UID: \"8ff25313-66ec-4853-bf80-45bec9ab0ccd\") " pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.855308 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5phr6\" (UniqueName: \"kubernetes.io/projected/8ff25313-66ec-4853-bf80-45bec9ab0ccd-kube-api-access-5phr6\") pod \"community-operators-qxq4d\" (UID: \"8ff25313-66ec-4853-bf80-45bec9ab0ccd\") " pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.923387 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c991ec-1b0a-4263-b45a-92a841c291f0-utilities\") pod \"certified-operators-b2kvp\" (UID: \"98c991ec-1b0a-4263-b45a-92a841c291f0\") " pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.923479 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngq6s\" (UniqueName: \"kubernetes.io/projected/98c991ec-1b0a-4263-b45a-92a841c291f0-kube-api-access-ngq6s\") pod \"certified-operators-b2kvp\" (UID: \"98c991ec-1b0a-4263-b45a-92a841c291f0\") " pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.923539 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98c991ec-1b0a-4263-b45a-92a841c291f0-catalog-content\") pod \"certified-operators-b2kvp\" (UID: 
\"98c991ec-1b0a-4263-b45a-92a841c291f0\") " pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:09 crc kubenswrapper[4788]: I0219 08:51:09.931063 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.028784 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c991ec-1b0a-4263-b45a-92a841c291f0-utilities\") pod \"certified-operators-b2kvp\" (UID: \"98c991ec-1b0a-4263-b45a-92a841c291f0\") " pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.028850 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngq6s\" (UniqueName: \"kubernetes.io/projected/98c991ec-1b0a-4263-b45a-92a841c291f0-kube-api-access-ngq6s\") pod \"certified-operators-b2kvp\" (UID: \"98c991ec-1b0a-4263-b45a-92a841c291f0\") " pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.028885 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98c991ec-1b0a-4263-b45a-92a841c291f0-catalog-content\") pod \"certified-operators-b2kvp\" (UID: \"98c991ec-1b0a-4263-b45a-92a841c291f0\") " pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.029528 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98c991ec-1b0a-4263-b45a-92a841c291f0-utilities\") pod \"certified-operators-b2kvp\" (UID: \"98c991ec-1b0a-4263-b45a-92a841c291f0\") " pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.029540 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98c991ec-1b0a-4263-b45a-92a841c291f0-catalog-content\") pod \"certified-operators-b2kvp\" (UID: \"98c991ec-1b0a-4263-b45a-92a841c291f0\") " pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.057502 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngq6s\" (UniqueName: \"kubernetes.io/projected/98c991ec-1b0a-4263-b45a-92a841c291f0-kube-api-access-ngq6s\") pod \"certified-operators-b2kvp\" (UID: \"98c991ec-1b0a-4263-b45a-92a841c291f0\") " pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.108841 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.353467 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxq4d"] Feb 19 08:51:10 crc kubenswrapper[4788]: W0219 08:51:10.364151 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ff25313_66ec_4853_bf80_45bec9ab0ccd.slice/crio-0a8e4e3fc3bd86b3c98244373daef3db8a2c2e6a8cb214ebc89cda7f28cdfeb5 WatchSource:0}: Error finding container 0a8e4e3fc3bd86b3c98244373daef3db8a2c2e6a8cb214ebc89cda7f28cdfeb5: Status 404 returned error can't find the container with id 0a8e4e3fc3bd86b3c98244373daef3db8a2c2e6a8cb214ebc89cda7f28cdfeb5 Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.557006 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b2kvp"] Feb 19 08:51:10 crc kubenswrapper[4788]: W0219 08:51:10.567856 4788 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98c991ec_1b0a_4263_b45a_92a841c291f0.slice/crio-54b95ad8235178c99ca9060bc4d69db8394d00b9dc9614a82696ff1fd80d0903 WatchSource:0}: Error finding container 54b95ad8235178c99ca9060bc4d69db8394d00b9dc9614a82696ff1fd80d0903: Status 404 returned error can't find the container with id 54b95ad8235178c99ca9060bc4d69db8394d00b9dc9614a82696ff1fd80d0903 Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.645890 4788 generic.go:334] "Generic (PLEG): container finished" podID="8ff25313-66ec-4853-bf80-45bec9ab0ccd" containerID="9c1e05effcd9f842b6f82c5ccf2013edb515b3ea30e4f721d690f1e95c4360b6" exitCode=0 Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.645937 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxq4d" event={"ID":"8ff25313-66ec-4853-bf80-45bec9ab0ccd","Type":"ContainerDied","Data":"9c1e05effcd9f842b6f82c5ccf2013edb515b3ea30e4f721d690f1e95c4360b6"} Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.646001 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxq4d" event={"ID":"8ff25313-66ec-4853-bf80-45bec9ab0ccd","Type":"ContainerStarted","Data":"0a8e4e3fc3bd86b3c98244373daef3db8a2c2e6a8cb214ebc89cda7f28cdfeb5"} Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.648392 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2kvp" event={"ID":"98c991ec-1b0a-4263-b45a-92a841c291f0","Type":"ContainerStarted","Data":"54b95ad8235178c99ca9060bc4d69db8394d00b9dc9614a82696ff1fd80d0903"} Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.653295 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jxxx" event={"ID":"f19f6400-b7f1-4a05-beb6-dc7ff4e23d71","Type":"ContainerStarted","Data":"7d5eaf2ea2ba8198fb72570b51ced9888c9d5eac81293d5bf3e74524b34a5962"} Feb 19 08:51:10 crc kubenswrapper[4788]: 
I0219 08:51:10.654824 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxdcj" event={"ID":"e8935778-1ae8-4c0c-8189-e6240f5c2d23","Type":"ContainerStarted","Data":"c9a372aa63b2b9f34550fecada09a65dcdacdcc6a2233df79feececc358f1185"} Feb 19 08:51:10 crc kubenswrapper[4788]: I0219 08:51:10.684504 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4jxxx" podStartSLOduration=2.212097376 podStartE2EDuration="3.684488886s" podCreationTimestamp="2026-02-19 08:51:07 +0000 UTC" firstStartedPulling="2026-02-19 08:51:08.626579901 +0000 UTC m=+370.614591373" lastFinishedPulling="2026-02-19 08:51:10.098971411 +0000 UTC m=+372.086982883" observedRunningTime="2026-02-19 08:51:10.682786807 +0000 UTC m=+372.670798279" watchObservedRunningTime="2026-02-19 08:51:10.684488886 +0000 UTC m=+372.672500358" Feb 19 08:51:11 crc kubenswrapper[4788]: I0219 08:51:11.666533 4788 generic.go:334] "Generic (PLEG): container finished" podID="e8935778-1ae8-4c0c-8189-e6240f5c2d23" containerID="c9a372aa63b2b9f34550fecada09a65dcdacdcc6a2233df79feececc358f1185" exitCode=0 Feb 19 08:51:11 crc kubenswrapper[4788]: I0219 08:51:11.666611 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxdcj" event={"ID":"e8935778-1ae8-4c0c-8189-e6240f5c2d23","Type":"ContainerDied","Data":"c9a372aa63b2b9f34550fecada09a65dcdacdcc6a2233df79feececc358f1185"} Feb 19 08:51:11 crc kubenswrapper[4788]: I0219 08:51:11.668414 4788 generic.go:334] "Generic (PLEG): container finished" podID="8ff25313-66ec-4853-bf80-45bec9ab0ccd" containerID="71ffc028ced86f19222c3d41a8410d5d3a2dffba538070209ff6aacf64d1135d" exitCode=0 Feb 19 08:51:11 crc kubenswrapper[4788]: I0219 08:51:11.668572 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxq4d" 
event={"ID":"8ff25313-66ec-4853-bf80-45bec9ab0ccd","Type":"ContainerDied","Data":"71ffc028ced86f19222c3d41a8410d5d3a2dffba538070209ff6aacf64d1135d"} Feb 19 08:51:11 crc kubenswrapper[4788]: I0219 08:51:11.671823 4788 generic.go:334] "Generic (PLEG): container finished" podID="98c991ec-1b0a-4263-b45a-92a841c291f0" containerID="fb745099be3921ceb4f554861f980fa35664153738c197402f75b8562067fbfb" exitCode=0 Feb 19 08:51:11 crc kubenswrapper[4788]: I0219 08:51:11.673215 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2kvp" event={"ID":"98c991ec-1b0a-4263-b45a-92a841c291f0","Type":"ContainerDied","Data":"fb745099be3921ceb4f554861f980fa35664153738c197402f75b8562067fbfb"} Feb 19 08:51:12 crc kubenswrapper[4788]: I0219 08:51:12.680317 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hxdcj" event={"ID":"e8935778-1ae8-4c0c-8189-e6240f5c2d23","Type":"ContainerStarted","Data":"4ecc1a18d73ee930b664d55280ef651a5bbdbb64c030f2761b3351fde4f10b97"} Feb 19 08:51:12 crc kubenswrapper[4788]: I0219 08:51:12.682775 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxq4d" event={"ID":"8ff25313-66ec-4853-bf80-45bec9ab0ccd","Type":"ContainerStarted","Data":"88a5acdfb6c702a7f625930a55436a044a9c77582c12828e74c3915b059e76c5"} Feb 19 08:51:12 crc kubenswrapper[4788]: I0219 08:51:12.684814 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2kvp" event={"ID":"98c991ec-1b0a-4263-b45a-92a841c291f0","Type":"ContainerStarted","Data":"1b44445540cf1d975764ad3be85ad28e948402296163d9b2e1e489a6e8998bdc"} Feb 19 08:51:12 crc kubenswrapper[4788]: I0219 08:51:12.705810 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hxdcj" podStartSLOduration=3.120749675 podStartE2EDuration="5.705793121s" podCreationTimestamp="2026-02-19 08:51:07 +0000 
UTC" firstStartedPulling="2026-02-19 08:51:09.635634358 +0000 UTC m=+371.623645870" lastFinishedPulling="2026-02-19 08:51:12.220677834 +0000 UTC m=+374.208689316" observedRunningTime="2026-02-19 08:51:12.701987652 +0000 UTC m=+374.689999134" watchObservedRunningTime="2026-02-19 08:51:12.705793121 +0000 UTC m=+374.693804603" Feb 19 08:51:12 crc kubenswrapper[4788]: I0219 08:51:12.719382 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qxq4d" podStartSLOduration=2.068080783 podStartE2EDuration="3.71936584s" podCreationTimestamp="2026-02-19 08:51:09 +0000 UTC" firstStartedPulling="2026-02-19 08:51:10.648062782 +0000 UTC m=+372.636074254" lastFinishedPulling="2026-02-19 08:51:12.299347839 +0000 UTC m=+374.287359311" observedRunningTime="2026-02-19 08:51:12.717429834 +0000 UTC m=+374.705441306" watchObservedRunningTime="2026-02-19 08:51:12.71936584 +0000 UTC m=+374.707377322" Feb 19 08:51:12 crc kubenswrapper[4788]: I0219 08:51:12.870741 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs"] Feb 19 08:51:12 crc kubenswrapper[4788]: I0219 08:51:12.871337 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" podUID="3a981a43-0e78-4b13-8a7f-029dda4075fa" containerName="route-controller-manager" containerID="cri-o://c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964" gracePeriod=30 Feb 19 08:51:12 crc kubenswrapper[4788]: I0219 08:51:12.943560 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-zm4v9" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.024426 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pj5df"] Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 
08:51:13.321456 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.408013 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-client-ca\") pod \"3a981a43-0e78-4b13-8a7f-029dda4075fa\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.408133 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9txvd\" (UniqueName: \"kubernetes.io/projected/3a981a43-0e78-4b13-8a7f-029dda4075fa-kube-api-access-9txvd\") pod \"3a981a43-0e78-4b13-8a7f-029dda4075fa\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.408167 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-config\") pod \"3a981a43-0e78-4b13-8a7f-029dda4075fa\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.408262 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a981a43-0e78-4b13-8a7f-029dda4075fa-serving-cert\") pod \"3a981a43-0e78-4b13-8a7f-029dda4075fa\" (UID: \"3a981a43-0e78-4b13-8a7f-029dda4075fa\") " Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.408771 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-client-ca" (OuterVolumeSpecName: "client-ca") pod "3a981a43-0e78-4b13-8a7f-029dda4075fa" (UID: "3a981a43-0e78-4b13-8a7f-029dda4075fa"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.409114 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-config" (OuterVolumeSpecName: "config") pod "3a981a43-0e78-4b13-8a7f-029dda4075fa" (UID: "3a981a43-0e78-4b13-8a7f-029dda4075fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.414045 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a981a43-0e78-4b13-8a7f-029dda4075fa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3a981a43-0e78-4b13-8a7f-029dda4075fa" (UID: "3a981a43-0e78-4b13-8a7f-029dda4075fa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.414629 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a981a43-0e78-4b13-8a7f-029dda4075fa-kube-api-access-9txvd" (OuterVolumeSpecName: "kube-api-access-9txvd") pod "3a981a43-0e78-4b13-8a7f-029dda4075fa" (UID: "3a981a43-0e78-4b13-8a7f-029dda4075fa"). InnerVolumeSpecName "kube-api-access-9txvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.510075 4788 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-client-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.510359 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9txvd\" (UniqueName: \"kubernetes.io/projected/3a981a43-0e78-4b13-8a7f-029dda4075fa-kube-api-access-9txvd\") on node \"crc\" DevicePath \"\"" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.510425 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a981a43-0e78-4b13-8a7f-029dda4075fa-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.510493 4788 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a981a43-0e78-4b13-8a7f-029dda4075fa-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.691981 4788 generic.go:334] "Generic (PLEG): container finished" podID="3a981a43-0e78-4b13-8a7f-029dda4075fa" containerID="c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964" exitCode=0 Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.692073 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" event={"ID":"3a981a43-0e78-4b13-8a7f-029dda4075fa","Type":"ContainerDied","Data":"c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964"} Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.692179 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.692219 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs" event={"ID":"3a981a43-0e78-4b13-8a7f-029dda4075fa","Type":"ContainerDied","Data":"ee75b91ec004db4fa5ba86d959ed7cabdad7402c939e3d41269333400fdeaa47"} Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.692272 4788 scope.go:117] "RemoveContainer" containerID="c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.695086 4788 generic.go:334] "Generic (PLEG): container finished" podID="98c991ec-1b0a-4263-b45a-92a841c291f0" containerID="1b44445540cf1d975764ad3be85ad28e948402296163d9b2e1e489a6e8998bdc" exitCode=0 Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.695156 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2kvp" event={"ID":"98c991ec-1b0a-4263-b45a-92a841c291f0","Type":"ContainerDied","Data":"1b44445540cf1d975764ad3be85ad28e948402296163d9b2e1e489a6e8998bdc"} Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.732265 4788 scope.go:117] "RemoveContainer" containerID="c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964" Feb 19 08:51:13 crc kubenswrapper[4788]: E0219 08:51:13.732638 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964\": container with ID starting with c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964 not found: ID does not exist" containerID="c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.732670 4788 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964"} err="failed to get container status \"c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964\": rpc error: code = NotFound desc = could not find container \"c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964\": container with ID starting with c8d36a3798f6ec80be74d3439fcb330c894f208b897f2ac6077121c6619d2964 not found: ID does not exist" Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.761387 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs"] Feb 19 08:51:13 crc kubenswrapper[4788]: I0219 08:51:13.771126 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6cf6f5f4-6t4bs"] Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.517973 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj"] Feb 19 08:51:14 crc kubenswrapper[4788]: E0219 08:51:14.518811 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a981a43-0e78-4b13-8a7f-029dda4075fa" containerName="route-controller-manager" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.518826 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a981a43-0e78-4b13-8a7f-029dda4075fa" containerName="route-controller-manager" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.518968 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a981a43-0e78-4b13-8a7f-029dda4075fa" containerName="route-controller-manager" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.519440 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.521225 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.522127 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.522537 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.522974 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.523193 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.530074 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.539796 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj"] Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.622703 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-config\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.622781 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-serving-cert\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.622803 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8496l\" (UniqueName: \"kubernetes.io/projected/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-kube-api-access-8496l\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.622833 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-client-ca\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.704655 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b2kvp" event={"ID":"98c991ec-1b0a-4263-b45a-92a841c291f0","Type":"ContainerStarted","Data":"3878d4d4a6f8a434f4e6bb0cafb43dac63a72045bee3e427e4699128b84b4f91"} Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.723312 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-client-ca\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " 
pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.723372 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-config\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.723519 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-serving-cert\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.723537 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8496l\" (UniqueName: \"kubernetes.io/projected/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-kube-api-access-8496l\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.724142 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a981a43-0e78-4b13-8a7f-029dda4075fa" path="/var/lib/kubelet/pods/3a981a43-0e78-4b13-8a7f-029dda4075fa/volumes" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.725029 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-client-ca\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " 
pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.726796 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-config\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.740146 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b2kvp" podStartSLOduration=3.225275727 podStartE2EDuration="5.74012297s" podCreationTimestamp="2026-02-19 08:51:09 +0000 UTC" firstStartedPulling="2026-02-19 08:51:11.675368842 +0000 UTC m=+373.663380344" lastFinishedPulling="2026-02-19 08:51:14.190216115 +0000 UTC m=+376.178227587" observedRunningTime="2026-02-19 08:51:14.737443693 +0000 UTC m=+376.725455185" watchObservedRunningTime="2026-02-19 08:51:14.74012297 +0000 UTC m=+376.728134442" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.742495 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-serving-cert\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.752973 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8496l\" (UniqueName: \"kubernetes.io/projected/b6a0cfe3-408b-49f4-b75b-f698b1354bc1-kube-api-access-8496l\") pod \"route-controller-manager-765446c856-5zjlj\" (UID: \"b6a0cfe3-408b-49f4-b75b-f698b1354bc1\") " pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" 
Feb 19 08:51:14 crc kubenswrapper[4788]: I0219 08:51:14.835408 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:15 crc kubenswrapper[4788]: I0219 08:51:15.255061 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj"] Feb 19 08:51:15 crc kubenswrapper[4788]: W0219 08:51:15.267534 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6a0cfe3_408b_49f4_b75b_f698b1354bc1.slice/crio-05e2201b36c7dc32237a560ebf8415a3d1d7283f6e1b0000f4169d205502eede WatchSource:0}: Error finding container 05e2201b36c7dc32237a560ebf8415a3d1d7283f6e1b0000f4169d205502eede: Status 404 returned error can't find the container with id 05e2201b36c7dc32237a560ebf8415a3d1d7283f6e1b0000f4169d205502eede Feb 19 08:51:15 crc kubenswrapper[4788]: I0219 08:51:15.725736 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" event={"ID":"b6a0cfe3-408b-49f4-b75b-f698b1354bc1","Type":"ContainerStarted","Data":"fde431d667af3a6a4c59681eb16d58be3d97176baee1f914eebec6dd22f719ab"} Feb 19 08:51:15 crc kubenswrapper[4788]: I0219 08:51:15.726170 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:15 crc kubenswrapper[4788]: I0219 08:51:15.726186 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" event={"ID":"b6a0cfe3-408b-49f4-b75b-f698b1354bc1","Type":"ContainerStarted","Data":"05e2201b36c7dc32237a560ebf8415a3d1d7283f6e1b0000f4169d205502eede"} Feb 19 08:51:16 crc kubenswrapper[4788]: I0219 08:51:15.748255 4788 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" podStartSLOduration=3.748227648 podStartE2EDuration="3.748227648s" podCreationTimestamp="2026-02-19 08:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:51:15.7475684 +0000 UTC m=+377.735579872" watchObservedRunningTime="2026-02-19 08:51:15.748227648 +0000 UTC m=+377.736239120" Feb 19 08:51:16 crc kubenswrapper[4788]: I0219 08:51:16.200342 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-765446c856-5zjlj" Feb 19 08:51:17 crc kubenswrapper[4788]: I0219 08:51:17.521893 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4jxxx" Feb 19 08:51:17 crc kubenswrapper[4788]: I0219 08:51:17.521958 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4jxxx" Feb 19 08:51:17 crc kubenswrapper[4788]: I0219 08:51:17.566077 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4jxxx" Feb 19 08:51:17 crc kubenswrapper[4788]: I0219 08:51:17.774327 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4jxxx" Feb 19 08:51:18 crc kubenswrapper[4788]: I0219 08:51:18.696418 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hxdcj" Feb 19 08:51:18 crc kubenswrapper[4788]: I0219 08:51:18.697311 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hxdcj" Feb 19 08:51:19 crc kubenswrapper[4788]: I0219 08:51:19.755732 4788 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-hxdcj" podUID="e8935778-1ae8-4c0c-8189-e6240f5c2d23" containerName="registry-server" probeResult="failure" output=< Feb 19 08:51:19 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 08:51:19 crc kubenswrapper[4788]: > Feb 19 08:51:19 crc kubenswrapper[4788]: I0219 08:51:19.932124 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:19 crc kubenswrapper[4788]: I0219 08:51:19.932722 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:19 crc kubenswrapper[4788]: I0219 08:51:19.981506 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:20 crc kubenswrapper[4788]: I0219 08:51:20.109856 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:20 crc kubenswrapper[4788]: I0219 08:51:20.110305 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:20 crc kubenswrapper[4788]: I0219 08:51:20.155777 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:20 crc kubenswrapper[4788]: I0219 08:51:20.817240 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qxq4d" Feb 19 08:51:20 crc kubenswrapper[4788]: I0219 08:51:20.825088 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b2kvp" Feb 19 08:51:22 crc kubenswrapper[4788]: I0219 08:51:22.139842 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 08:51:22 crc kubenswrapper[4788]: I0219 08:51:22.140177 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 08:51:22 crc kubenswrapper[4788]: I0219 08:51:22.140230 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:51:22 crc kubenswrapper[4788]: I0219 08:51:22.142120 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"487f4bc93be66363f5fd6689b686bc27a3cdc3fd662d9f911cdcd094586d7699"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 08:51:22 crc kubenswrapper[4788]: I0219 08:51:22.142222 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://487f4bc93be66363f5fd6689b686bc27a3cdc3fd662d9f911cdcd094586d7699" gracePeriod=600 Feb 19 08:51:23 crc kubenswrapper[4788]: I0219 08:51:23.769073 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="487f4bc93be66363f5fd6689b686bc27a3cdc3fd662d9f911cdcd094586d7699" exitCode=0 Feb 19 08:51:23 crc kubenswrapper[4788]: I0219 08:51:23.769170 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"487f4bc93be66363f5fd6689b686bc27a3cdc3fd662d9f911cdcd094586d7699"} Feb 19 08:51:23 crc kubenswrapper[4788]: I0219 08:51:23.769703 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"2ddcd6e3366811879446e0480e0ef6c2ddb99413bd8f64b1d4a29ea61b1f6c94"} Feb 19 08:51:23 crc kubenswrapper[4788]: I0219 08:51:23.769737 4788 scope.go:117] "RemoveContainer" containerID="ff13e3768858651d4fa77944c8d0b0fb3b98c1f36f4d07af39568b46adf37e96" Feb 19 08:51:28 crc kubenswrapper[4788]: I0219 08:51:28.776111 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hxdcj" Feb 19 08:51:28 crc kubenswrapper[4788]: I0219 08:51:28.837914 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hxdcj" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.066242 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" podUID="89374177-14ef-4b9a-938a-a838d6d0aab1" containerName="registry" containerID="cri-o://88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322" gracePeriod=30 Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.544295 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.607604 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89374177-14ef-4b9a-938a-a838d6d0aab1-ca-trust-extracted\") pod \"89374177-14ef-4b9a-938a-a838d6d0aab1\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.607678 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-bound-sa-token\") pod \"89374177-14ef-4b9a-938a-a838d6d0aab1\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.607725 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkdn9\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-kube-api-access-lkdn9\") pod \"89374177-14ef-4b9a-938a-a838d6d0aab1\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.607748 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-tls\") pod \"89374177-14ef-4b9a-938a-a838d6d0aab1\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.608181 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"89374177-14ef-4b9a-938a-a838d6d0aab1\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.608223 4788 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-certificates\") pod \"89374177-14ef-4b9a-938a-a838d6d0aab1\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.608282 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-trusted-ca\") pod \"89374177-14ef-4b9a-938a-a838d6d0aab1\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.608333 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89374177-14ef-4b9a-938a-a838d6d0aab1-installation-pull-secrets\") pod \"89374177-14ef-4b9a-938a-a838d6d0aab1\" (UID: \"89374177-14ef-4b9a-938a-a838d6d0aab1\") " Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.614108 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "89374177-14ef-4b9a-938a-a838d6d0aab1" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.614746 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "89374177-14ef-4b9a-938a-a838d6d0aab1" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.618703 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89374177-14ef-4b9a-938a-a838d6d0aab1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "89374177-14ef-4b9a-938a-a838d6d0aab1" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.621458 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-kube-api-access-lkdn9" (OuterVolumeSpecName: "kube-api-access-lkdn9") pod "89374177-14ef-4b9a-938a-a838d6d0aab1" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1"). InnerVolumeSpecName "kube-api-access-lkdn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.622154 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "89374177-14ef-4b9a-938a-a838d6d0aab1" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.626669 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "89374177-14ef-4b9a-938a-a838d6d0aab1" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.628447 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "89374177-14ef-4b9a-938a-a838d6d0aab1" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.645749 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89374177-14ef-4b9a-938a-a838d6d0aab1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "89374177-14ef-4b9a-938a-a838d6d0aab1" (UID: "89374177-14ef-4b9a-938a-a838d6d0aab1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.710519 4788 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/89374177-14ef-4b9a-938a-a838d6d0aab1-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.710667 4788 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/89374177-14ef-4b9a-938a-a838d6d0aab1-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.710688 4788 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.710708 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkdn9\" (UniqueName: 
\"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-kube-api-access-lkdn9\") on node \"crc\" DevicePath \"\"" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.710730 4788 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.710749 4788 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.710767 4788 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/89374177-14ef-4b9a-938a-a838d6d0aab1-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.880316 4788 generic.go:334] "Generic (PLEG): container finished" podID="89374177-14ef-4b9a-938a-a838d6d0aab1" containerID="88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322" exitCode=0 Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.880361 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" event={"ID":"89374177-14ef-4b9a-938a-a838d6d0aab1","Type":"ContainerDied","Data":"88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322"} Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.880391 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" event={"ID":"89374177-14ef-4b9a-938a-a838d6d0aab1","Type":"ContainerDied","Data":"fa782e474278ecdf926d593727c42416fcc76e0e96c5e2c9859d654415dd801c"} Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.880411 4788 scope.go:117] "RemoveContainer" 
containerID="88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.880445 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.909810 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pj5df"] Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.914425 4788 scope.go:117] "RemoveContainer" containerID="88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322" Feb 19 08:51:38 crc kubenswrapper[4788]: E0219 08:51:38.915070 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322\": container with ID starting with 88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322 not found: ID does not exist" containerID="88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.915143 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322"} err="failed to get container status \"88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322\": rpc error: code = NotFound desc = could not find container \"88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322\": container with ID starting with 88a729e390e65a7aeda76aeef9cb3050b6e8f826134a6d12bd48e2b10a640322 not found: ID does not exist" Feb 19 08:51:38 crc kubenswrapper[4788]: I0219 08:51:38.917325 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-pj5df"] Feb 19 08:51:40 crc kubenswrapper[4788]: I0219 08:51:40.723701 4788 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="89374177-14ef-4b9a-938a-a838d6d0aab1" path="/var/lib/kubelet/pods/89374177-14ef-4b9a-938a-a838d6d0aab1/volumes" Feb 19 08:51:43 crc kubenswrapper[4788]: I0219 08:51:43.418837 4788 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-pj5df container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.25:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 19 08:51:43 crc kubenswrapper[4788]: I0219 08:51:43.419315 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-pj5df" podUID="89374177-14ef-4b9a-938a-a838d6d0aab1" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.25:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 19 08:53:52 crc kubenswrapper[4788]: I0219 08:53:52.139043 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 08:53:52 crc kubenswrapper[4788]: I0219 08:53:52.139778 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 08:53:58 crc kubenswrapper[4788]: I0219 08:53:58.996950 4788 scope.go:117] "RemoveContainer" containerID="b9b6ab176119d897c2f9a3e5060602f1c6497159c52d9db471ff155bfeaa6cfe" Feb 19 08:54:22 crc kubenswrapper[4788]: I0219 08:54:22.139893 4788 patch_prober.go:28] interesting 
pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 08:54:22 crc kubenswrapper[4788]: I0219 08:54:22.140676 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 08:54:52 crc kubenswrapper[4788]: I0219 08:54:52.139553 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 08:54:52 crc kubenswrapper[4788]: I0219 08:54:52.140165 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 08:54:52 crc kubenswrapper[4788]: I0219 08:54:52.140213 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:54:52 crc kubenswrapper[4788]: I0219 08:54:52.141032 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ddcd6e3366811879446e0480e0ef6c2ddb99413bd8f64b1d4a29ea61b1f6c94"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Feb 19 08:54:52 crc kubenswrapper[4788]: I0219 08:54:52.141111 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://2ddcd6e3366811879446e0480e0ef6c2ddb99413bd8f64b1d4a29ea61b1f6c94" gracePeriod=600 Feb 19 08:54:53 crc kubenswrapper[4788]: I0219 08:54:53.212908 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="2ddcd6e3366811879446e0480e0ef6c2ddb99413bd8f64b1d4a29ea61b1f6c94" exitCode=0 Feb 19 08:54:53 crc kubenswrapper[4788]: I0219 08:54:53.212969 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"2ddcd6e3366811879446e0480e0ef6c2ddb99413bd8f64b1d4a29ea61b1f6c94"} Feb 19 08:54:53 crc kubenswrapper[4788]: I0219 08:54:53.213634 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"7e35f3f513564bff5ea2198c67409383ba481d995f59e4d674440d785743deb5"} Feb 19 08:54:53 crc kubenswrapper[4788]: I0219 08:54:53.213668 4788 scope.go:117] "RemoveContainer" containerID="487f4bc93be66363f5fd6689b686bc27a3cdc3fd662d9f911cdcd094586d7699" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.205336 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-qnl27"] Feb 19 08:55:06 crc kubenswrapper[4788]: E0219 08:55:06.206113 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89374177-14ef-4b9a-938a-a838d6d0aab1" containerName="registry" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.206128 4788 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="89374177-14ef-4b9a-938a-a838d6d0aab1" containerName="registry" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.206339 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="89374177-14ef-4b9a-938a-a838d6d0aab1" containerName="registry" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.206839 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qnl27" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.208861 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-qnl27"] Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.208987 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.209105 4788 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-kr59b" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.209292 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.215847 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-2sfnb"] Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.217032 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-2sfnb" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.222843 4788 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-pctlm" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.240417 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-2sfnb"] Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.245618 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-t5hxr"] Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.246422 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-t5hxr" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.250461 4788 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-m92w2" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.254629 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-t5hxr"] Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.324162 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrqnv\" (UniqueName: \"kubernetes.io/projected/14962c5c-80cd-4aa8-918b-902a3853e50c-kube-api-access-qrqnv\") pod \"cert-manager-858654f9db-2sfnb\" (UID: \"14962c5c-80cd-4aa8-918b-902a3853e50c\") " pod="cert-manager/cert-manager-858654f9db-2sfnb" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.324334 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv42b\" (UniqueName: \"kubernetes.io/projected/75272a75-157a-4506-98cb-d6fe9ff79580-kube-api-access-mv42b\") pod \"cert-manager-webhook-687f57d79b-t5hxr\" (UID: \"75272a75-157a-4506-98cb-d6fe9ff79580\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-t5hxr" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.324456 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95j48\" (UniqueName: \"kubernetes.io/projected/d755f7fa-b68d-421f-b4b5-c25a4ba5af59-kube-api-access-95j48\") pod \"cert-manager-cainjector-cf98fcc89-qnl27\" (UID: \"d755f7fa-b68d-421f-b4b5-c25a4ba5af59\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-qnl27" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.425436 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrqnv\" (UniqueName: \"kubernetes.io/projected/14962c5c-80cd-4aa8-918b-902a3853e50c-kube-api-access-qrqnv\") pod \"cert-manager-858654f9db-2sfnb\" (UID: \"14962c5c-80cd-4aa8-918b-902a3853e50c\") " pod="cert-manager/cert-manager-858654f9db-2sfnb" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.425507 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv42b\" (UniqueName: \"kubernetes.io/projected/75272a75-157a-4506-98cb-d6fe9ff79580-kube-api-access-mv42b\") pod \"cert-manager-webhook-687f57d79b-t5hxr\" (UID: \"75272a75-157a-4506-98cb-d6fe9ff79580\") " pod="cert-manager/cert-manager-webhook-687f57d79b-t5hxr" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.425549 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95j48\" (UniqueName: \"kubernetes.io/projected/d755f7fa-b68d-421f-b4b5-c25a4ba5af59-kube-api-access-95j48\") pod \"cert-manager-cainjector-cf98fcc89-qnl27\" (UID: \"d755f7fa-b68d-421f-b4b5-c25a4ba5af59\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-qnl27" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.442805 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95j48\" (UniqueName: 
\"kubernetes.io/projected/d755f7fa-b68d-421f-b4b5-c25a4ba5af59-kube-api-access-95j48\") pod \"cert-manager-cainjector-cf98fcc89-qnl27\" (UID: \"d755f7fa-b68d-421f-b4b5-c25a4ba5af59\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-qnl27" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.443528 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv42b\" (UniqueName: \"kubernetes.io/projected/75272a75-157a-4506-98cb-d6fe9ff79580-kube-api-access-mv42b\") pod \"cert-manager-webhook-687f57d79b-t5hxr\" (UID: \"75272a75-157a-4506-98cb-d6fe9ff79580\") " pod="cert-manager/cert-manager-webhook-687f57d79b-t5hxr" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.444199 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrqnv\" (UniqueName: \"kubernetes.io/projected/14962c5c-80cd-4aa8-918b-902a3853e50c-kube-api-access-qrqnv\") pod \"cert-manager-858654f9db-2sfnb\" (UID: \"14962c5c-80cd-4aa8-918b-902a3853e50c\") " pod="cert-manager/cert-manager-858654f9db-2sfnb" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.530327 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qnl27" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.534424 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-2sfnb" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.565304 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-t5hxr" Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.780607 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-qnl27"] Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.790220 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.817388 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-t5hxr"] Feb 19 08:55:06 crc kubenswrapper[4788]: W0219 08:55:06.825511 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75272a75_157a_4506_98cb_d6fe9ff79580.slice/crio-ff5e849b5c009130d38d3f2de2aeab5f274362c1abdcbf12249edc54efbbcccb WatchSource:0}: Error finding container ff5e849b5c009130d38d3f2de2aeab5f274362c1abdcbf12249edc54efbbcccb: Status 404 returned error can't find the container with id ff5e849b5c009130d38d3f2de2aeab5f274362c1abdcbf12249edc54efbbcccb Feb 19 08:55:06 crc kubenswrapper[4788]: I0219 08:55:06.859373 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-2sfnb"] Feb 19 08:55:06 crc kubenswrapper[4788]: W0219 08:55:06.865997 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14962c5c_80cd_4aa8_918b_902a3853e50c.slice/crio-7da5f341b41c2b1d527e135ce08a79fc147d5fe1f8af0559263ad53276233768 WatchSource:0}: Error finding container 7da5f341b41c2b1d527e135ce08a79fc147d5fe1f8af0559263ad53276233768: Status 404 returned error can't find the container with id 7da5f341b41c2b1d527e135ce08a79fc147d5fe1f8af0559263ad53276233768 Feb 19 08:55:07 crc kubenswrapper[4788]: I0219 08:55:07.301480 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-qnl27" event={"ID":"d755f7fa-b68d-421f-b4b5-c25a4ba5af59","Type":"ContainerStarted","Data":"8af9f6b6d10f2d35a8f079c6af91006cd9af9a058a20900a4e841b02eeaacab1"} Feb 19 08:55:07 crc kubenswrapper[4788]: I0219 08:55:07.303403 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-t5hxr" event={"ID":"75272a75-157a-4506-98cb-d6fe9ff79580","Type":"ContainerStarted","Data":"ff5e849b5c009130d38d3f2de2aeab5f274362c1abdcbf12249edc54efbbcccb"} Feb 19 08:55:07 crc kubenswrapper[4788]: I0219 08:55:07.304883 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-2sfnb" event={"ID":"14962c5c-80cd-4aa8-918b-902a3853e50c","Type":"ContainerStarted","Data":"7da5f341b41c2b1d527e135ce08a79fc147d5fe1f8af0559263ad53276233768"} Feb 19 08:55:10 crc kubenswrapper[4788]: I0219 08:55:10.322526 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-2sfnb" event={"ID":"14962c5c-80cd-4aa8-918b-902a3853e50c","Type":"ContainerStarted","Data":"166f83b965c1348b9e5265735471b04eadf38f9153b30a755f9204e21ba84dc1"} Feb 19 08:55:10 crc kubenswrapper[4788]: I0219 08:55:10.337556 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-2sfnb" podStartSLOduration=1.4132190310000001 podStartE2EDuration="4.33753985s" podCreationTimestamp="2026-02-19 08:55:06 +0000 UTC" firstStartedPulling="2026-02-19 08:55:06.868062819 +0000 UTC m=+608.856074311" lastFinishedPulling="2026-02-19 08:55:09.792383618 +0000 UTC m=+611.780395130" observedRunningTime="2026-02-19 08:55:10.337476469 +0000 UTC m=+612.325487951" watchObservedRunningTime="2026-02-19 08:55:10.33753985 +0000 UTC m=+612.325551322" Feb 19 08:55:11 crc kubenswrapper[4788]: I0219 08:55:11.331347 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-t5hxr" 
event={"ID":"75272a75-157a-4506-98cb-d6fe9ff79580","Type":"ContainerStarted","Data":"2466d51590e42d0a18d36faaed4fd01e7c20139b6a1ef00bfc4d4ed707addd1f"} Feb 19 08:55:11 crc kubenswrapper[4788]: I0219 08:55:11.331690 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-t5hxr" Feb 19 08:55:11 crc kubenswrapper[4788]: I0219 08:55:11.334928 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qnl27" event={"ID":"d755f7fa-b68d-421f-b4b5-c25a4ba5af59","Type":"ContainerStarted","Data":"c13e2e82bbbba0f77f5521f88627f4550619371caaa5007ccefedeccee080b4a"} Feb 19 08:55:11 crc kubenswrapper[4788]: I0219 08:55:11.352269 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-t5hxr" podStartSLOduration=1.983083847 podStartE2EDuration="5.3522219s" podCreationTimestamp="2026-02-19 08:55:06 +0000 UTC" firstStartedPulling="2026-02-19 08:55:06.8269278 +0000 UTC m=+608.814939272" lastFinishedPulling="2026-02-19 08:55:10.196065843 +0000 UTC m=+612.184077325" observedRunningTime="2026-02-19 08:55:11.351627097 +0000 UTC m=+613.339638609" watchObservedRunningTime="2026-02-19 08:55:11.3522219 +0000 UTC m=+613.340233392" Feb 19 08:55:11 crc kubenswrapper[4788]: I0219 08:55:11.376009 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-qnl27" podStartSLOduration=1.7754108579999999 podStartE2EDuration="5.375981043s" podCreationTimestamp="2026-02-19 08:55:06 +0000 UTC" firstStartedPulling="2026-02-19 08:55:06.789991321 +0000 UTC m=+608.778002793" lastFinishedPulling="2026-02-19 08:55:10.390561476 +0000 UTC m=+612.378572978" observedRunningTime="2026-02-19 08:55:11.368387709 +0000 UTC m=+613.356399221" watchObservedRunningTime="2026-02-19 08:55:11.375981043 +0000 UTC m=+613.363992545" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 
08:55:16.300181 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xmshh"] Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.301368 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovn-controller" containerID="cri-o://5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400" gracePeriod=30 Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.301450 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="nbdb" containerID="cri-o://81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd" gracePeriod=30 Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.301538 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="northd" containerID="cri-o://6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740" gracePeriod=30 Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.301598 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="sbdb" containerID="cri-o://46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5" gracePeriod=30 Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.301697 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovn-acl-logging" containerID="cri-o://85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429" gracePeriod=30 Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.301701 4788 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f" gracePeriod=30 Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.301724 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="kube-rbac-proxy-node" containerID="cri-o://3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6" gracePeriod=30 Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.327743 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" containerID="cri-o://9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0" gracePeriod=30 Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.568913 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-t5hxr" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.574602 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd5c1c46_74a4_41f4_ad05_af438781bd6a.slice/crio-6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd5c1c46_74a4_41f4_ad05_af438781bd6a.slice/crio-conmon-6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740.scope\": RecentStats: unable to find data in memory cache]" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.670455 4788 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/3.log" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.673329 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovn-acl-logging/0.log" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.674655 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovn-controller/0.log" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.675209 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744039 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lvgws"] Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744384 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="kube-rbac-proxy-node" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744404 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="kube-rbac-proxy-node" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744415 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="sbdb" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744423 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="sbdb" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744433 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="kubecfg-setup" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 
08:55:16.744441 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="kubecfg-setup" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744455 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="nbdb" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744462 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="nbdb" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744475 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="northd" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744483 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="northd" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744493 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744500 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744513 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovn-acl-logging" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744522 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovn-acl-logging" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744536 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744547 4788 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744562 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744570 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744579 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744586 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744597 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744605 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744612 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovn-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744619 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovn-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744770 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="northd" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744788 4788 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744797 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="sbdb" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744804 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovn-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744815 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744822 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovn-acl-logging" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744829 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744838 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="kube-rbac-proxy-ovn-metrics" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744846 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="nbdb" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.744856 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="kube-rbac-proxy-node" Feb 19 08:55:16 crc kubenswrapper[4788]: E0219 08:55:16.744968 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc 
kubenswrapper[4788]: I0219 08:55:16.744975 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.745091 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.745102 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerName="ovnkube-controller" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.748459 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791313 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791372 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-etc-openvswitch\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791412 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-systemd-units\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791442 4788 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-log-socket\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791477 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovn-node-metrics-cert\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791508 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-config\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791528 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-env-overrides\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791567 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-bin\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791585 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-ovn-kubernetes\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" 
(UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791607 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-kubelet\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791622 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6gjm\" (UniqueName: \"kubernetes.io/projected/fd5c1c46-74a4-41f4-ad05-af438781bd6a-kube-api-access-f6gjm\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791638 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-node-log\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791656 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-netd\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791672 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-ovn\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") " Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791688 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-openvswitch\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") "
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791700 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-netns\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") "
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791746 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-script-lib\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") "
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791766 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-systemd\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") "
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791788 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-slash\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") "
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.791805 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-var-lib-openvswitch\") pod \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\" (UID: \"fd5c1c46-74a4-41f4-ad05-af438781bd6a\") "
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.792050 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.792081 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.792103 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.792119 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.792136 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-log-socket" (OuterVolumeSpecName: "log-socket") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.792676 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-node-log" (OuterVolumeSpecName: "node-log") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.792981 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.793065 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.793119 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.793185 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.793221 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.793696 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.793784 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.793956 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.794079 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.799458 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.799617 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-slash" (OuterVolumeSpecName: "host-slash") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.802827 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5c1c46-74a4-41f4-ad05-af438781bd6a-kube-api-access-f6gjm" (OuterVolumeSpecName: "kube-api-access-f6gjm") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "kube-api-access-f6gjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.805853 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.825471 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "fd5c1c46-74a4-41f4-ad05-af438781bd6a" (UID: "fd5c1c46-74a4-41f4-ad05-af438781bd6a"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893039 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-run-ovn\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893095 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-cni-bin\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893118 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-var-lib-openvswitch\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893135 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-systemd-units\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893151 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-node-log\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893174 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-etc-openvswitch\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893190 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-log-socket\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893203 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-cni-netd\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893235 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca64b06d-2068-4946-b362-587da5f56a0c-ovn-node-metrics-cert\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893271 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca64b06d-2068-4946-b362-587da5f56a0c-ovnkube-script-lib\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893289 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893312 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-slash\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893328 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-run-netns\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893348 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-run-openvswitch\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893382 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t2dc\" (UniqueName: \"kubernetes.io/projected/ca64b06d-2068-4946-b362-587da5f56a0c-kube-api-access-6t2dc\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893407 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-run-ovn-kubernetes\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893438 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca64b06d-2068-4946-b362-587da5f56a0c-ovnkube-config\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893459 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca64b06d-2068-4946-b362-587da5f56a0c-env-overrides\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893474 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-run-systemd\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893494 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-kubelet\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893533 4788 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893544 4788 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893554 4788 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-systemd-units\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893563 4788 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-log-socket\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893571 4788 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893579 4788 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893588 4788 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893596 4788 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-bin\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893606 4788 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893616 4788 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-kubelet\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893624 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6gjm\" (UniqueName: \"kubernetes.io/projected/fd5c1c46-74a4-41f4-ad05-af438781bd6a-kube-api-access-f6gjm\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893632 4788 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-node-log\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893640 4788 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-cni-netd\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893648 4788 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893655 4788 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893665 4788 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-run-netns\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893673 4788 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd5c1c46-74a4-41f4-ad05-af438781bd6a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893683 4788 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-run-systemd\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893690 4788 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-host-slash\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.893698 4788 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd5c1c46-74a4-41f4-ad05-af438781bd6a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.994705 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca64b06d-2068-4946-b362-587da5f56a0c-ovnkube-config\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.994782 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca64b06d-2068-4946-b362-587da5f56a0c-env-overrides\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.994835 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-run-systemd\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.994893 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-kubelet\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.994951 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-run-ovn\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.994998 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-cni-bin\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995043 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-systemd-units\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995087 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-var-lib-openvswitch\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995129 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-node-log\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995194 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-etc-openvswitch\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995307 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-log-socket\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995352 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-cni-netd\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995400 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca64b06d-2068-4946-b362-587da5f56a0c-ovn-node-metrics-cert\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995445 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca64b06d-2068-4946-b362-587da5f56a0c-ovnkube-script-lib\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995491 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995557 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-slash\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995559 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca64b06d-2068-4946-b362-587da5f56a0c-env-overrides\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995601 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-run-netns\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995630 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-node-log\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995648 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-run-openvswitch\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995676 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-run-systemd\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995708 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-run-ovn\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995730 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-cni-bin\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995732 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca64b06d-2068-4946-b362-587da5f56a0c-ovnkube-config\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995748 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-kubelet\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995756 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-systemd-units\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995773 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-var-lib-openvswitch\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995792 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-run-netns\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995814 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995814 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-etc-openvswitch\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995839 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-slash\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995866 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-log-socket\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995881 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-run-openvswitch\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995903 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-cni-netd\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.995981 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t2dc\" (UniqueName: \"kubernetes.io/projected/ca64b06d-2068-4946-b362-587da5f56a0c-kube-api-access-6t2dc\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.996017 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-run-ovn-kubernetes\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.996111 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca64b06d-2068-4946-b362-587da5f56a0c-host-run-ovn-kubernetes\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 
19 08:55:16 crc kubenswrapper[4788]: I0219 08:55:16.996446 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca64b06d-2068-4946-b362-587da5f56a0c-ovnkube-script-lib\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.000266 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca64b06d-2068-4946-b362-587da5f56a0c-ovn-node-metrics-cert\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.015382 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t2dc\" (UniqueName: \"kubernetes.io/projected/ca64b06d-2068-4946-b362-587da5f56a0c-kube-api-access-6t2dc\") pod \"ovnkube-node-lvgws\" (UID: \"ca64b06d-2068-4946-b362-587da5f56a0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.067117 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" Feb 19 08:55:17 crc kubenswrapper[4788]: W0219 08:55:17.095893 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca64b06d_2068_4946_b362_587da5f56a0c.slice/crio-93db155013fbe25e179b4ba9a2318cc907b01639dda5644884db70518a47533c WatchSource:0}: Error finding container 93db155013fbe25e179b4ba9a2318cc907b01639dda5644884db70518a47533c: Status 404 returned error can't find the container with id 93db155013fbe25e179b4ba9a2318cc907b01639dda5644884db70518a47533c Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.378368 4788 generic.go:334] "Generic (PLEG): container finished" podID="ca64b06d-2068-4946-b362-587da5f56a0c" containerID="ccabf6202be2ef9bcdce3d7ab598e0f42e21f3420b7e9b44f929563e4785e6c1" exitCode=0 Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.378462 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" event={"ID":"ca64b06d-2068-4946-b362-587da5f56a0c","Type":"ContainerDied","Data":"ccabf6202be2ef9bcdce3d7ab598e0f42e21f3420b7e9b44f929563e4785e6c1"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.378732 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" event={"ID":"ca64b06d-2068-4946-b362-587da5f56a0c","Type":"ContainerStarted","Data":"93db155013fbe25e179b4ba9a2318cc907b01639dda5644884db70518a47533c"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.384164 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovnkube-controller/3.log" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.388428 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovn-acl-logging/0.log" Feb 19 08:55:17 crc 
kubenswrapper[4788]: I0219 08:55:17.389009 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xmshh_fd5c1c46-74a4-41f4-ad05-af438781bd6a/ovn-controller/0.log" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389738 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0" exitCode=0 Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389774 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5" exitCode=0 Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389814 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389857 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389871 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389886 4788 scope.go:117] "RemoveContainer" containerID="9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389906 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389821 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd" exitCode=0 Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389967 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740" exitCode=0 Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389977 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f" exitCode=0 Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389986 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6" exitCode=0 Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.389994 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429" exitCode=143 Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390002 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" containerID="5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400" exitCode=143 Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390028 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390039 4788 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390048 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390058 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390068 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390074 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390079 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390084 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390089 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390093 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390098 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390103 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390110 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390119 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390125 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390130 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390135 4788 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390140 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390145 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390149 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390154 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390159 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390165 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390173 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"} Feb 19 
08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390183 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390191 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390197 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390204 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390209 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390214 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390220 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390225 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"} Feb 19 
08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390230 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390235 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390256 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xmshh" event={"ID":"fd5c1c46-74a4-41f4-ad05-af438781bd6a","Type":"ContainerDied","Data":"58f1537e6e76b51681c65216d8d7d4364812d5e6e56fb239d3392181a50f22c0"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390264 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390270 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390275 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390280 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390284 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390290 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390295 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390300 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390305 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.390310 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.393203 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hxf6_a5c26787-29de-439a-86b8-920cac6c8ab8/kube-multus/2.log" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.393807 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hxf6_a5c26787-29de-439a-86b8-920cac6c8ab8/kube-multus/1.log" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.393839 4788 generic.go:334] "Generic (PLEG): container finished" podID="a5c26787-29de-439a-86b8-920cac6c8ab8" 
containerID="928812eee9cd494b61b87304c4ab4d58ffd651e9468a83240fec350eb56ec947" exitCode=2 Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.393858 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hxf6" event={"ID":"a5c26787-29de-439a-86b8-920cac6c8ab8","Type":"ContainerDied","Data":"928812eee9cd494b61b87304c4ab4d58ffd651e9468a83240fec350eb56ec947"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.393872 4788 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13b1bb93d87b038211f1e816a2498a060120d6338c3cae845dff3c87bb6e924d"} Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.394190 4788 scope.go:117] "RemoveContainer" containerID="928812eee9cd494b61b87304c4ab4d58ffd651e9468a83240fec350eb56ec947" Feb 19 08:55:17 crc kubenswrapper[4788]: E0219 08:55:17.394343 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9hxf6_openshift-multus(a5c26787-29de-439a-86b8-920cac6c8ab8)\"" pod="openshift-multus/multus-9hxf6" podUID="a5c26787-29de-439a-86b8-920cac6c8ab8" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.415582 4788 scope.go:117] "RemoveContainer" containerID="0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.493467 4788 scope.go:117] "RemoveContainer" containerID="46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.517226 4788 scope.go:117] "RemoveContainer" containerID="81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.537755 4788 scope.go:117] "RemoveContainer" containerID="6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 
08:55:17.546975 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xmshh"] Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.550585 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xmshh"] Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.577954 4788 scope.go:117] "RemoveContainer" containerID="a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.596522 4788 scope.go:117] "RemoveContainer" containerID="3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.613211 4788 scope.go:117] "RemoveContainer" containerID="85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.633365 4788 scope.go:117] "RemoveContainer" containerID="5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.665005 4788 scope.go:117] "RemoveContainer" containerID="81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.700342 4788 scope.go:117] "RemoveContainer" containerID="9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0" Feb 19 08:55:17 crc kubenswrapper[4788]: E0219 08:55:17.701229 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0\": container with ID starting with 9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0 not found: ID does not exist" containerID="9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.701305 4788 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"} err="failed to get container status \"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0\": rpc error: code = NotFound desc = could not find container \"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0\": container with ID starting with 9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0 not found: ID does not exist" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.701329 4788 scope.go:117] "RemoveContainer" containerID="0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e" Feb 19 08:55:17 crc kubenswrapper[4788]: E0219 08:55:17.701570 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\": container with ID starting with 0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e not found: ID does not exist" containerID="0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.701585 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"} err="failed to get container status \"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\": rpc error: code = NotFound desc = could not find container \"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\": container with ID starting with 0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e not found: ID does not exist" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.701597 4788 scope.go:117] "RemoveContainer" containerID="46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5" Feb 19 08:55:17 crc kubenswrapper[4788]: E0219 08:55:17.703530 4788 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\": container with ID starting with 46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5 not found: ID does not exist" containerID="46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.703568 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"} err="failed to get container status \"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\": rpc error: code = NotFound desc = could not find container \"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\": container with ID starting with 46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5 not found: ID does not exist" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.703590 4788 scope.go:117] "RemoveContainer" containerID="81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd" Feb 19 08:55:17 crc kubenswrapper[4788]: E0219 08:55:17.704035 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\": container with ID starting with 81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd not found: ID does not exist" containerID="81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd" Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.704087 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"} err="failed to get container status \"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\": rpc error: code = NotFound desc = could not find container 
\"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\": container with ID starting with 81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.704117 4788 scope.go:117] "RemoveContainer" containerID="6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"
Feb 19 08:55:17 crc kubenswrapper[4788]: E0219 08:55:17.704402 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\": container with ID starting with 6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740 not found: ID does not exist" containerID="6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.704438 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"} err="failed to get container status \"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\": rpc error: code = NotFound desc = could not find container \"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\": container with ID starting with 6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.704464 4788 scope.go:117] "RemoveContainer" containerID="a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"
Feb 19 08:55:17 crc kubenswrapper[4788]: E0219 08:55:17.704791 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\": container with ID starting with a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f not found: ID does not exist" containerID="a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.704822 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"} err="failed to get container status \"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\": rpc error: code = NotFound desc = could not find container \"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\": container with ID starting with a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.704842 4788 scope.go:117] "RemoveContainer" containerID="3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"
Feb 19 08:55:17 crc kubenswrapper[4788]: E0219 08:55:17.705157 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\": container with ID starting with 3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6 not found: ID does not exist" containerID="3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.705178 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"} err="failed to get container status \"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\": rpc error: code = NotFound desc = could not find container \"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\": container with ID starting with 3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.705192 4788 scope.go:117] "RemoveContainer" containerID="85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"
Feb 19 08:55:17 crc kubenswrapper[4788]: E0219 08:55:17.705450 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\": container with ID starting with 85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429 not found: ID does not exist" containerID="85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.705467 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"} err="failed to get container status \"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\": rpc error: code = NotFound desc = could not find container \"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\": container with ID starting with 85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.705480 4788 scope.go:117] "RemoveContainer" containerID="5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"
Feb 19 08:55:17 crc kubenswrapper[4788]: E0219 08:55:17.705731 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\": container with ID starting with 5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400 not found: ID does not exist" containerID="5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.705746 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"} err="failed to get container status \"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\": rpc error: code = NotFound desc = could not find container \"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\": container with ID starting with 5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.705759 4788 scope.go:117] "RemoveContainer" containerID="81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"
Feb 19 08:55:17 crc kubenswrapper[4788]: E0219 08:55:17.705967 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\": container with ID starting with 81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6 not found: ID does not exist" containerID="81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.705983 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"} err="failed to get container status \"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\": rpc error: code = NotFound desc = could not find container \"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\": container with ID starting with 81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.705997 4788 scope.go:117] "RemoveContainer" containerID="9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.706293 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"} err="failed to get container status \"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0\": rpc error: code = NotFound desc = could not find container \"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0\": container with ID starting with 9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.706316 4788 scope.go:117] "RemoveContainer" containerID="0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.706747 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"} err="failed to get container status \"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\": rpc error: code = NotFound desc = could not find container \"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\": container with ID starting with 0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.706778 4788 scope.go:117] "RemoveContainer" containerID="46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.707030 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"} err="failed to get container status \"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\": rpc error: code = NotFound desc = could not find container \"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\": container with ID starting with 46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.707067 4788 scope.go:117] "RemoveContainer" containerID="81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.707428 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"} err="failed to get container status \"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\": rpc error: code = NotFound desc = could not find container \"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\": container with ID starting with 81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.707459 4788 scope.go:117] "RemoveContainer" containerID="6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.708016 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"} err="failed to get container status \"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\": rpc error: code = NotFound desc = could not find container \"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\": container with ID starting with 6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.708036 4788 scope.go:117] "RemoveContainer" containerID="a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.708299 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"} err="failed to get container status \"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\": rpc error: code = NotFound desc = could not find container \"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\": container with ID starting with a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.708320 4788 scope.go:117] "RemoveContainer" containerID="3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.708641 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"} err="failed to get container status \"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\": rpc error: code = NotFound desc = could not find container \"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\": container with ID starting with 3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.708659 4788 scope.go:117] "RemoveContainer" containerID="85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.708970 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"} err="failed to get container status \"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\": rpc error: code = NotFound desc = could not find container \"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\": container with ID starting with 85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.708986 4788 scope.go:117] "RemoveContainer" containerID="5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.709377 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"} err="failed to get container status \"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\": rpc error: code = NotFound desc = could not find container \"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\": container with ID starting with 5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.709396 4788 scope.go:117] "RemoveContainer" containerID="81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.710042 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"} err="failed to get container status \"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\": rpc error: code = NotFound desc = could not find container \"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\": container with ID starting with 81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.710071 4788 scope.go:117] "RemoveContainer" containerID="9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.710729 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"} err="failed to get container status \"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0\": rpc error: code = NotFound desc = could not find container \"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0\": container with ID starting with 9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.710775 4788 scope.go:117] "RemoveContainer" containerID="0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.710994 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"} err="failed to get container status \"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\": rpc error: code = NotFound desc = could not find container \"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\": container with ID starting with 0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.711038 4788 scope.go:117] "RemoveContainer" containerID="46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.711368 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"} err="failed to get container status \"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\": rpc error: code = NotFound desc = could not find container \"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\": container with ID starting with 46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.711415 4788 scope.go:117] "RemoveContainer" containerID="81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.711650 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"} err="failed to get container status \"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\": rpc error: code = NotFound desc = could not find container \"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\": container with ID starting with 81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.711673 4788 scope.go:117] "RemoveContainer" containerID="6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.712127 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"} err="failed to get container status \"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\": rpc error: code = NotFound desc = could not find container \"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\": container with ID starting with 6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.712201 4788 scope.go:117] "RemoveContainer" containerID="a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.712561 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"} err="failed to get container status \"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\": rpc error: code = NotFound desc = could not find container \"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\": container with ID starting with a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.712578 4788 scope.go:117] "RemoveContainer" containerID="3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.713736 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"} err="failed to get container status \"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\": rpc error: code = NotFound desc = could not find container \"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\": container with ID starting with 3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.713758 4788 scope.go:117] "RemoveContainer" containerID="85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.714804 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"} err="failed to get container status \"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\": rpc error: code = NotFound desc = could not find container \"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\": container with ID starting with 85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.714850 4788 scope.go:117] "RemoveContainer" containerID="5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.715107 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"} err="failed to get container status \"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\": rpc error: code = NotFound desc = could not find container \"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\": container with ID starting with 5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.715128 4788 scope.go:117] "RemoveContainer" containerID="81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.715371 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"} err="failed to get container status \"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\": rpc error: code = NotFound desc = could not find container \"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\": container with ID starting with 81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.715388 4788 scope.go:117] "RemoveContainer" containerID="9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.715598 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"} err="failed to get container status \"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0\": rpc error: code = NotFound desc = could not find container \"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0\": container with ID starting with 9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.715613 4788 scope.go:117] "RemoveContainer" containerID="0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.715851 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e"} err="failed to get container status \"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\": rpc error: code = NotFound desc = could not find container \"0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e\": container with ID starting with 0eaa63158602a53197788596cce2f2cc93284e305453c3b97017d05b539e1d2e not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.715870 4788 scope.go:117] "RemoveContainer" containerID="46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.716080 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5"} err="failed to get container status \"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\": rpc error: code = NotFound desc = could not find container \"46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5\": container with ID starting with 46741a0dfd962809aa24dea375fc5075f475d1c03e3023cda35609b359c698d5 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.716101 4788 scope.go:117] "RemoveContainer" containerID="81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.716411 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd"} err="failed to get container status \"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\": rpc error: code = NotFound desc = could not find container \"81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd\": container with ID starting with 81409087ca54e4392bfacc054e370dbb8f116d3c6f68bb628586427d47b9ddfd not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.716435 4788 scope.go:117] "RemoveContainer" containerID="6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.716779 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740"} err="failed to get container status \"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\": rpc error: code = NotFound desc = could not find container \"6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740\": container with ID starting with 6013628ff26d29ced9ae0b1c4a04d274ed310f3a3057c247b3247fca1b2bf740 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.716800 4788 scope.go:117] "RemoveContainer" containerID="a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.717067 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f"} err="failed to get container status \"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\": rpc error: code = NotFound desc = could not find container \"a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f\": container with ID starting with a085ad647de4f55dfa50c24a57f3f0c397167a37d44f0fcb0a33abfb630a438f not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.717088 4788 scope.go:117] "RemoveContainer" containerID="3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.717311 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6"} err="failed to get container status \"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\": rpc error: code = NotFound desc = could not find container \"3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6\": container with ID starting with 3a814ddafc8e421a345617a60a7dc1673a12e61cdb59439750b39c661340d3a6 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.717339 4788 scope.go:117] "RemoveContainer" containerID="85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.717556 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429"} err="failed to get container status \"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\": rpc error: code = NotFound desc = could not find container \"85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429\": container with ID starting with 85b3d9bd614d329a8b263f2336e05b19ff5a729a52deee51041a1cb03c31e429 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.717575 4788 scope.go:117] "RemoveContainer" containerID="5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.717910 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400"} err="failed to get container status \"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\": rpc error: code = NotFound desc = could not find container \"5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400\": container with ID starting with 5879fe9f5423aad71b1740e9e4047037dd6f8c0cd031a46cee41ba0ec6b5f400 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.717955 4788 scope.go:117] "RemoveContainer" containerID="81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.718394 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6"} err="failed to get container status \"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\": rpc error: code = NotFound desc = could not find container \"81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6\": container with ID starting with 81ebe3a5fbac593fa94232d9fad13dd12490285b463b5be2d960d34e7d0d12c6 not found: ID does not exist"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.718435 4788 scope.go:117] "RemoveContainer" containerID="9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"
Feb 19 08:55:17 crc kubenswrapper[4788]: I0219 08:55:17.718703 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0"} err="failed to get container status \"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0\": rpc error: code = NotFound desc = could not find container \"9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0\": container with ID starting with 9233add065696a3937c447d7b1c4bef6d7dad2f97632f335fff683881dcb2ea0 not found: ID does not exist"
Feb 19 08:55:18 crc kubenswrapper[4788]: I0219 08:55:18.404444 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" event={"ID":"ca64b06d-2068-4946-b362-587da5f56a0c","Type":"ContainerStarted","Data":"1b036d7616e70399c071c6037507e24732a815e89a7b5e19b0b3db16e7ba1b3a"}
Feb 19 08:55:18 crc kubenswrapper[4788]: I0219 08:55:18.404783 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" event={"ID":"ca64b06d-2068-4946-b362-587da5f56a0c","Type":"ContainerStarted","Data":"7a9ece6f70b155b46a5de5b42971fc945438ebe17eca7f4644ab42982bb332f6"}
Feb 19 08:55:18 crc kubenswrapper[4788]: I0219 08:55:18.404795 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" event={"ID":"ca64b06d-2068-4946-b362-587da5f56a0c","Type":"ContainerStarted","Data":"69d2180dc8d72fd43d17983c83311ea7b3614d011f004e69ddc764fa800823be"}
Feb 19 08:55:18 crc kubenswrapper[4788]: I0219 08:55:18.404806 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" event={"ID":"ca64b06d-2068-4946-b362-587da5f56a0c","Type":"ContainerStarted","Data":"a6d67294a13624cc2972695004952b9a93f098845737742e03ca4947849e3eb7"}
Feb 19 08:55:18 crc kubenswrapper[4788]: I0219 08:55:18.404816 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" event={"ID":"ca64b06d-2068-4946-b362-587da5f56a0c","Type":"ContainerStarted","Data":"7b17203abe340fc91059c9f5946bf0250a16bb8071e852392bdd78d627cb8b6c"}
Feb 19 08:55:18 crc kubenswrapper[4788]: I0219 08:55:18.404827 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" event={"ID":"ca64b06d-2068-4946-b362-587da5f56a0c","Type":"ContainerStarted","Data":"93f360fe32a481d376775ef12bebb24b263a3c3880ea634b06ee9766f28df4aa"}
Feb 19 08:55:18 crc kubenswrapper[4788]: I0219 08:55:18.725056 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd5c1c46-74a4-41f4-ad05-af438781bd6a" path="/var/lib/kubelet/pods/fd5c1c46-74a4-41f4-ad05-af438781bd6a/volumes"
Feb 19 08:55:21 crc kubenswrapper[4788]: I0219 08:55:21.436138 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" event={"ID":"ca64b06d-2068-4946-b362-587da5f56a0c","Type":"ContainerStarted","Data":"009ae4178d7730fd0a5675d0671b6f4dbd398b86ef338b559390853e2ef3f424"}
Feb 19 08:55:23 crc kubenswrapper[4788]: I0219 08:55:23.452867 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" event={"ID":"ca64b06d-2068-4946-b362-587da5f56a0c","Type":"ContainerStarted","Data":"e58acfd1d607252da7dfa50a36c87a41ad990beb46490fd283305070a6154e35"}
Feb 19 08:55:23 crc kubenswrapper[4788]: I0219 08:55:23.453458 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:23 crc kubenswrapper[4788]: I0219 08:55:23.453478 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:23 crc kubenswrapper[4788]: I0219 08:55:23.484107 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:23 crc kubenswrapper[4788]: I0219 08:55:23.492048 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws" podStartSLOduration=7.492031944 podStartE2EDuration="7.492031944s" podCreationTimestamp="2026-02-19 08:55:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:55:23.490651611 +0000 UTC m=+625.478663103" watchObservedRunningTime="2026-02-19 08:55:23.492031944 +0000 UTC m=+625.480043416"
Feb 19 08:55:24 crc kubenswrapper[4788]: I0219 08:55:24.463167 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:24 crc kubenswrapper[4788]: I0219 08:55:24.501341 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:31 crc kubenswrapper[4788]: I0219 08:55:31.714575 4788 scope.go:117] "RemoveContainer" containerID="928812eee9cd494b61b87304c4ab4d58ffd651e9468a83240fec350eb56ec947"
Feb 19 08:55:31 crc kubenswrapper[4788]: E0219 08:55:31.715776 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9hxf6_openshift-multus(a5c26787-29de-439a-86b8-920cac6c8ab8)\"" pod="openshift-multus/multus-9hxf6" podUID="a5c26787-29de-439a-86b8-920cac6c8ab8"
Feb 19 08:55:45 crc kubenswrapper[4788]: I0219 08:55:45.714351 4788 scope.go:117] "RemoveContainer" containerID="928812eee9cd494b61b87304c4ab4d58ffd651e9468a83240fec350eb56ec947"
Feb 19 08:55:46 crc kubenswrapper[4788]: I0219 08:55:46.627313 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hxf6_a5c26787-29de-439a-86b8-920cac6c8ab8/kube-multus/2.log"
Feb 19 08:55:46 crc kubenswrapper[4788]: I0219 08:55:46.628650 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hxf6_a5c26787-29de-439a-86b8-920cac6c8ab8/kube-multus/1.log"
Feb 19 08:55:46 crc kubenswrapper[4788]: I0219 08:55:46.628720 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hxf6" event={"ID":"a5c26787-29de-439a-86b8-920cac6c8ab8","Type":"ContainerStarted","Data":"012ff40798bbf0f49c00582ea8a2420194095c37b13d1f398e43b55ed8064cf3"}
Feb 19 08:55:47 crc kubenswrapper[4788]: I0219 08:55:47.093369 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lvgws"
Feb 19 08:55:50 crc kubenswrapper[4788]: I0219 08:55:50.964852 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs"]
Feb 19 08:55:50 crc kubenswrapper[4788]: I0219 08:55:50.966300 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs"
Feb 19 08:55:50 crc kubenswrapper[4788]: I0219 08:55:50.968132 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 19 08:55:50 crc kubenswrapper[4788]: I0219 08:55:50.984081 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs"]
Feb 19 08:55:51 crc kubenswrapper[4788]: I0219 08:55:51.066727 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs"
Feb 19 08:55:51 crc kubenswrapper[4788]: I0219 08:55:51.066819 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs"
Feb 19 08:55:51 crc kubenswrapper[4788]: I0219 08:55:51.066870 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlt7v\" (UniqueName: \"kubernetes.io/projected/5dff47f0-9373-498b-b03f-fe5106d271b7-kube-api-access-rlt7v\") pod 
\"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" Feb 19 08:55:51 crc kubenswrapper[4788]: I0219 08:55:51.168024 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlt7v\" (UniqueName: \"kubernetes.io/projected/5dff47f0-9373-498b-b03f-fe5106d271b7-kube-api-access-rlt7v\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" Feb 19 08:55:51 crc kubenswrapper[4788]: I0219 08:55:51.168167 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" Feb 19 08:55:51 crc kubenswrapper[4788]: I0219 08:55:51.168227 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" Feb 19 08:55:51 crc kubenswrapper[4788]: I0219 08:55:51.168946 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" Feb 19 08:55:51 crc kubenswrapper[4788]: I0219 08:55:51.169155 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" Feb 19 08:55:51 crc kubenswrapper[4788]: I0219 08:55:51.212110 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlt7v\" (UniqueName: \"kubernetes.io/projected/5dff47f0-9373-498b-b03f-fe5106d271b7-kube-api-access-rlt7v\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" Feb 19 08:55:51 crc kubenswrapper[4788]: I0219 08:55:51.280778 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" Feb 19 08:55:51 crc kubenswrapper[4788]: I0219 08:55:51.679436 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs"] Feb 19 08:55:51 crc kubenswrapper[4788]: W0219 08:55:51.691634 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dff47f0_9373_498b_b03f_fe5106d271b7.slice/crio-c409e6cd0df13533dc8122d731249e22ccf5837c5b2a49f275ffbd5a2e3b5688 WatchSource:0}: Error finding container c409e6cd0df13533dc8122d731249e22ccf5837c5b2a49f275ffbd5a2e3b5688: Status 404 returned error can't find the container with id c409e6cd0df13533dc8122d731249e22ccf5837c5b2a49f275ffbd5a2e3b5688 Feb 19 08:55:52 crc kubenswrapper[4788]: I0219 08:55:52.687136 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" event={"ID":"5dff47f0-9373-498b-b03f-fe5106d271b7","Type":"ContainerStarted","Data":"24d1d0c9e678a8ee9d6ff0a4004f4bcb5f4bc611713faaa6276d019501af2027"} Feb 19 08:55:52 crc kubenswrapper[4788]: I0219 08:55:52.687589 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" event={"ID":"5dff47f0-9373-498b-b03f-fe5106d271b7","Type":"ContainerStarted","Data":"c409e6cd0df13533dc8122d731249e22ccf5837c5b2a49f275ffbd5a2e3b5688"} Feb 19 08:55:53 crc kubenswrapper[4788]: I0219 08:55:53.696156 4788 generic.go:334] "Generic (PLEG): container finished" podID="5dff47f0-9373-498b-b03f-fe5106d271b7" containerID="24d1d0c9e678a8ee9d6ff0a4004f4bcb5f4bc611713faaa6276d019501af2027" exitCode=0 Feb 19 08:55:53 crc kubenswrapper[4788]: I0219 08:55:53.696216 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" event={"ID":"5dff47f0-9373-498b-b03f-fe5106d271b7","Type":"ContainerDied","Data":"24d1d0c9e678a8ee9d6ff0a4004f4bcb5f4bc611713faaa6276d019501af2027"} Feb 19 08:55:55 crc kubenswrapper[4788]: I0219 08:55:55.712682 4788 generic.go:334] "Generic (PLEG): container finished" podID="5dff47f0-9373-498b-b03f-fe5106d271b7" containerID="c034794dcd8b882145175f74f0ac9e60bf662c588d4f23ad9fe4155c11264d62" exitCode=0 Feb 19 08:55:55 crc kubenswrapper[4788]: I0219 08:55:55.712780 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" event={"ID":"5dff47f0-9373-498b-b03f-fe5106d271b7","Type":"ContainerDied","Data":"c034794dcd8b882145175f74f0ac9e60bf662c588d4f23ad9fe4155c11264d62"} Feb 19 08:55:56 crc kubenswrapper[4788]: I0219 08:55:56.723949 4788 generic.go:334] "Generic (PLEG): container finished" podID="5dff47f0-9373-498b-b03f-fe5106d271b7" containerID="7ad93576b59d19c0c6d7edc40445d3e7eec9c2752e36675796c0836c1a5f7581" exitCode=0 Feb 19 08:55:56 crc kubenswrapper[4788]: I0219 08:55:56.726232 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" event={"ID":"5dff47f0-9373-498b-b03f-fe5106d271b7","Type":"ContainerDied","Data":"7ad93576b59d19c0c6d7edc40445d3e7eec9c2752e36675796c0836c1a5f7581"} Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.067121 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.176102 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlt7v\" (UniqueName: \"kubernetes.io/projected/5dff47f0-9373-498b-b03f-fe5106d271b7-kube-api-access-rlt7v\") pod \"5dff47f0-9373-498b-b03f-fe5106d271b7\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.176194 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-util\") pod \"5dff47f0-9373-498b-b03f-fe5106d271b7\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.176290 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-bundle\") pod \"5dff47f0-9373-498b-b03f-fe5106d271b7\" (UID: \"5dff47f0-9373-498b-b03f-fe5106d271b7\") " Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.177703 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-bundle" (OuterVolumeSpecName: "bundle") pod "5dff47f0-9373-498b-b03f-fe5106d271b7" (UID: "5dff47f0-9373-498b-b03f-fe5106d271b7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.184194 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dff47f0-9373-498b-b03f-fe5106d271b7-kube-api-access-rlt7v" (OuterVolumeSpecName: "kube-api-access-rlt7v") pod "5dff47f0-9373-498b-b03f-fe5106d271b7" (UID: "5dff47f0-9373-498b-b03f-fe5106d271b7"). InnerVolumeSpecName "kube-api-access-rlt7v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.201435 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-util" (OuterVolumeSpecName: "util") pod "5dff47f0-9373-498b-b03f-fe5106d271b7" (UID: "5dff47f0-9373-498b-b03f-fe5106d271b7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.278703 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlt7v\" (UniqueName: \"kubernetes.io/projected/5dff47f0-9373-498b-b03f-fe5106d271b7-kube-api-access-rlt7v\") on node \"crc\" DevicePath \"\"" Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.278864 4788 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-util\") on node \"crc\" DevicePath \"\"" Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.278886 4788 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5dff47f0-9373-498b-b03f-fe5106d271b7-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.744664 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" event={"ID":"5dff47f0-9373-498b-b03f-fe5106d271b7","Type":"ContainerDied","Data":"c409e6cd0df13533dc8122d731249e22ccf5837c5b2a49f275ffbd5a2e3b5688"} Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.744741 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c409e6cd0df13533dc8122d731249e22ccf5837c5b2a49f275ffbd5a2e3b5688" Feb 19 08:55:58 crc kubenswrapper[4788]: I0219 08:55:58.744812 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs" Feb 19 08:55:59 crc kubenswrapper[4788]: I0219 08:55:59.051983 4788 scope.go:117] "RemoveContainer" containerID="13b1bb93d87b038211f1e816a2498a060120d6338c3cae845dff3c87bb6e924d" Feb 19 08:55:59 crc kubenswrapper[4788]: I0219 08:55:59.756188 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hxf6_a5c26787-29de-439a-86b8-920cac6c8ab8/kube-multus/2.log" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.337620 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qpczt"] Feb 19 08:56:02 crc kubenswrapper[4788]: E0219 08:56:02.338263 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dff47f0-9373-498b-b03f-fe5106d271b7" containerName="util" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.338283 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dff47f0-9373-498b-b03f-fe5106d271b7" containerName="util" Feb 19 08:56:02 crc kubenswrapper[4788]: E0219 08:56:02.338323 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dff47f0-9373-498b-b03f-fe5106d271b7" containerName="pull" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.338336 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dff47f0-9373-498b-b03f-fe5106d271b7" containerName="pull" Feb 19 08:56:02 crc kubenswrapper[4788]: E0219 08:56:02.338357 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dff47f0-9373-498b-b03f-fe5106d271b7" containerName="extract" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.338371 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dff47f0-9373-498b-b03f-fe5106d271b7" containerName="extract" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.338548 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dff47f0-9373-498b-b03f-fe5106d271b7" 
containerName="extract" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.339179 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-qpczt" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.340887 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.341063 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.341103 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-l7vg8" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.346343 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qpczt"] Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.438144 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f96z\" (UniqueName: \"kubernetes.io/projected/f57ff293-836e-4dbb-a634-1f74882dc23f-kube-api-access-2f96z\") pod \"nmstate-operator-694c9596b7-qpczt\" (UID: \"f57ff293-836e-4dbb-a634-1f74882dc23f\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qpczt" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.539809 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f96z\" (UniqueName: \"kubernetes.io/projected/f57ff293-836e-4dbb-a634-1f74882dc23f-kube-api-access-2f96z\") pod \"nmstate-operator-694c9596b7-qpczt\" (UID: \"f57ff293-836e-4dbb-a634-1f74882dc23f\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qpczt" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.563458 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f96z\" (UniqueName: 
\"kubernetes.io/projected/f57ff293-836e-4dbb-a634-1f74882dc23f-kube-api-access-2f96z\") pod \"nmstate-operator-694c9596b7-qpczt\" (UID: \"f57ff293-836e-4dbb-a634-1f74882dc23f\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qpczt" Feb 19 08:56:02 crc kubenswrapper[4788]: I0219 08:56:02.655495 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-qpczt" Feb 19 08:56:03 crc kubenswrapper[4788]: I0219 08:56:03.113351 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qpczt"] Feb 19 08:56:03 crc kubenswrapper[4788]: I0219 08:56:03.783430 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-qpczt" event={"ID":"f57ff293-836e-4dbb-a634-1f74882dc23f","Type":"ContainerStarted","Data":"91203c40d045c1365fe463fbcf5049c5c1a0dcfca90cb037ec1d3341ae5cfab8"} Feb 19 08:56:05 crc kubenswrapper[4788]: I0219 08:56:05.798623 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-qpczt" event={"ID":"f57ff293-836e-4dbb-a634-1f74882dc23f","Type":"ContainerStarted","Data":"3b5d6f53eff823ef0433ec48242dac8d4efa924c6267e3bcd1fff307b1dcbb45"} Feb 19 08:56:05 crc kubenswrapper[4788]: I0219 08:56:05.823384 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-qpczt" podStartSLOduration=2.21873361 podStartE2EDuration="3.823351025s" podCreationTimestamp="2026-02-19 08:56:02 +0000 UTC" firstStartedPulling="2026-02-19 08:56:03.119559749 +0000 UTC m=+665.107571221" lastFinishedPulling="2026-02-19 08:56:04.724177164 +0000 UTC m=+666.712188636" observedRunningTime="2026-02-19 08:56:05.822484015 +0000 UTC m=+667.810495487" watchObservedRunningTime="2026-02-19 08:56:05.823351025 +0000 UTC m=+667.811362537" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.307525 4788 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-qq85c"] Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.309206 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qq85c" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.311745 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-lth5s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.339290 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-qq85c"] Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.347514 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb"] Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.348406 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.350230 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.355475 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-s4s7s"] Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.356715 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.357733 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pl9f\" (UniqueName: \"kubernetes.io/projected/6bd96454-70db-4a47-808c-b377bbe1bd00-kube-api-access-9pl9f\") pod \"nmstate-metrics-58c85c668d-qq85c\" (UID: \"6bd96454-70db-4a47-808c-b377bbe1bd00\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-qq85c" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.362759 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb"] Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.451491 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp"] Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.452155 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.454334 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.456772 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.456912 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-7bfkc" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.458859 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5874210a-7a83-4166-8486-0f827772fc30-nmstate-lock\") pod \"nmstate-handler-s4s7s\" (UID: \"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc 
kubenswrapper[4788]: I0219 08:56:11.458910 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pl9f\" (UniqueName: \"kubernetes.io/projected/6bd96454-70db-4a47-808c-b377bbe1bd00-kube-api-access-9pl9f\") pod \"nmstate-metrics-58c85c668d-qq85c\" (UID: \"6bd96454-70db-4a47-808c-b377bbe1bd00\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-qq85c" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.458946 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/316cd090-6e25-48f3-89d6-21f11c7aafd9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-tvjrb\" (UID: \"316cd090-6e25-48f3-89d6-21f11c7aafd9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.458984 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj9l7\" (UniqueName: \"kubernetes.io/projected/5874210a-7a83-4166-8486-0f827772fc30-kube-api-access-nj9l7\") pod \"nmstate-handler-s4s7s\" (UID: \"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.459023 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptnvs\" (UniqueName: \"kubernetes.io/projected/316cd090-6e25-48f3-89d6-21f11c7aafd9-kube-api-access-ptnvs\") pod \"nmstate-webhook-866bcb46dc-tvjrb\" (UID: \"316cd090-6e25-48f3-89d6-21f11c7aafd9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.459185 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5874210a-7a83-4166-8486-0f827772fc30-ovs-socket\") pod \"nmstate-handler-s4s7s\" (UID: 
\"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.459356 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5874210a-7a83-4166-8486-0f827772fc30-dbus-socket\") pod \"nmstate-handler-s4s7s\" (UID: \"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.469481 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp"] Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.485414 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pl9f\" (UniqueName: \"kubernetes.io/projected/6bd96454-70db-4a47-808c-b377bbe1bd00-kube-api-access-9pl9f\") pod \"nmstate-metrics-58c85c668d-qq85c\" (UID: \"6bd96454-70db-4a47-808c-b377bbe1bd00\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-qq85c" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561056 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj9l7\" (UniqueName: \"kubernetes.io/projected/5874210a-7a83-4166-8486-0f827772fc30-kube-api-access-nj9l7\") pod \"nmstate-handler-s4s7s\" (UID: \"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561175 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc4fe95c-3af5-4be2-bf60-2c5e02c18df9-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-4s4xp\" (UID: \"cc4fe95c-3af5-4be2-bf60-2c5e02c18df9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561234 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptnvs\" (UniqueName: \"kubernetes.io/projected/316cd090-6e25-48f3-89d6-21f11c7aafd9-kube-api-access-ptnvs\") pod \"nmstate-webhook-866bcb46dc-tvjrb\" (UID: \"316cd090-6e25-48f3-89d6-21f11c7aafd9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561356 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5874210a-7a83-4166-8486-0f827772fc30-ovs-socket\") pod \"nmstate-handler-s4s7s\" (UID: \"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561417 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5874210a-7a83-4166-8486-0f827772fc30-dbus-socket\") pod \"nmstate-handler-s4s7s\" (UID: \"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561458 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdx9w\" (UniqueName: \"kubernetes.io/projected/cc4fe95c-3af5-4be2-bf60-2c5e02c18df9-kube-api-access-pdx9w\") pod \"nmstate-console-plugin-5c78fc5d65-4s4xp\" (UID: \"cc4fe95c-3af5-4be2-bf60-2c5e02c18df9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561525 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc4fe95c-3af5-4be2-bf60-2c5e02c18df9-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-4s4xp\" (UID: \"cc4fe95c-3af5-4be2-bf60-2c5e02c18df9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" Feb 19 
08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561562 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5874210a-7a83-4166-8486-0f827772fc30-nmstate-lock\") pod \"nmstate-handler-s4s7s\" (UID: \"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561613 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/316cd090-6e25-48f3-89d6-21f11c7aafd9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-tvjrb\" (UID: \"316cd090-6e25-48f3-89d6-21f11c7aafd9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561726 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5874210a-7a83-4166-8486-0f827772fc30-ovs-socket\") pod \"nmstate-handler-s4s7s\" (UID: \"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561825 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5874210a-7a83-4166-8486-0f827772fc30-nmstate-lock\") pod \"nmstate-handler-s4s7s\" (UID: \"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: E0219 08:56:11.561829 4788 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.561906 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5874210a-7a83-4166-8486-0f827772fc30-dbus-socket\") pod \"nmstate-handler-s4s7s\" (UID: 
\"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: E0219 08:56:11.561936 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/316cd090-6e25-48f3-89d6-21f11c7aafd9-tls-key-pair podName:316cd090-6e25-48f3-89d6-21f11c7aafd9 nodeName:}" failed. No retries permitted until 2026-02-19 08:56:12.061899937 +0000 UTC m=+674.049911449 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/316cd090-6e25-48f3-89d6-21f11c7aafd9-tls-key-pair") pod "nmstate-webhook-866bcb46dc-tvjrb" (UID: "316cd090-6e25-48f3-89d6-21f11c7aafd9") : secret "openshift-nmstate-webhook" not found Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.585123 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj9l7\" (UniqueName: \"kubernetes.io/projected/5874210a-7a83-4166-8486-0f827772fc30-kube-api-access-nj9l7\") pod \"nmstate-handler-s4s7s\" (UID: \"5874210a-7a83-4166-8486-0f827772fc30\") " pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.594861 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptnvs\" (UniqueName: \"kubernetes.io/projected/316cd090-6e25-48f3-89d6-21f11c7aafd9-kube-api-access-ptnvs\") pod \"nmstate-webhook-866bcb46dc-tvjrb\" (UID: \"316cd090-6e25-48f3-89d6-21f11c7aafd9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.629466 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qq85c" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.663827 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc4fe95c-3af5-4be2-bf60-2c5e02c18df9-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-4s4xp\" (UID: \"cc4fe95c-3af5-4be2-bf60-2c5e02c18df9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.663976 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc4fe95c-3af5-4be2-bf60-2c5e02c18df9-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-4s4xp\" (UID: \"cc4fe95c-3af5-4be2-bf60-2c5e02c18df9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.664038 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdx9w\" (UniqueName: \"kubernetes.io/projected/cc4fe95c-3af5-4be2-bf60-2c5e02c18df9-kube-api-access-pdx9w\") pod \"nmstate-console-plugin-5c78fc5d65-4s4xp\" (UID: \"cc4fe95c-3af5-4be2-bf60-2c5e02c18df9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.665442 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc4fe95c-3af5-4be2-bf60-2c5e02c18df9-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-4s4xp\" (UID: \"cc4fe95c-3af5-4be2-bf60-2c5e02c18df9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.674407 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cc4fe95c-3af5-4be2-bf60-2c5e02c18df9-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-4s4xp\" (UID: \"cc4fe95c-3af5-4be2-bf60-2c5e02c18df9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.677215 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.691962 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdx9w\" (UniqueName: \"kubernetes.io/projected/cc4fe95c-3af5-4be2-bf60-2c5e02c18df9-kube-api-access-pdx9w\") pod \"nmstate-console-plugin-5c78fc5d65-4s4xp\" (UID: \"cc4fe95c-3af5-4be2-bf60-2c5e02c18df9\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.696111 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-68896946c5-t55qf"] Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.696741 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.724390 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68896946c5-t55qf"] Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.765043 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-service-ca\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.765092 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71e2c148-aade-4c9c-a8d8-a837a8b08658-console-oauth-config\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.765110 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-oauth-serving-cert\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.765144 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-trusted-ca-bundle\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.765168 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71e2c148-aade-4c9c-a8d8-a837a8b08658-console-serving-cert\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.765205 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd8kx\" (UniqueName: \"kubernetes.io/projected/71e2c148-aade-4c9c-a8d8-a837a8b08658-kube-api-access-dd8kx\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.765291 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-console-config\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.765943 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.843322 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s4s7s" event={"ID":"5874210a-7a83-4166-8486-0f827772fc30","Type":"ContainerStarted","Data":"04a3188cdfe97559719cd7b17b11e7498ae4149334e05c480287e3dd23277e70"} Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.867341 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-service-ca\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.867471 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71e2c148-aade-4c9c-a8d8-a837a8b08658-console-oauth-config\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.867499 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-oauth-serving-cert\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.867566 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-trusted-ca-bundle\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc 
kubenswrapper[4788]: I0219 08:56:11.867601 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71e2c148-aade-4c9c-a8d8-a837a8b08658-console-serving-cert\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.867662 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd8kx\" (UniqueName: \"kubernetes.io/projected/71e2c148-aade-4c9c-a8d8-a837a8b08658-kube-api-access-dd8kx\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.867761 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-console-config\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.868822 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-console-config\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.869495 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-service-ca\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.871148 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-oauth-serving-cert\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.872460 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71e2c148-aade-4c9c-a8d8-a837a8b08658-trusted-ca-bundle\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.877868 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71e2c148-aade-4c9c-a8d8-a837a8b08658-console-serving-cert\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.878427 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71e2c148-aade-4c9c-a8d8-a837a8b08658-console-oauth-config\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:11 crc kubenswrapper[4788]: I0219 08:56:11.890180 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd8kx\" (UniqueName: \"kubernetes.io/projected/71e2c148-aade-4c9c-a8d8-a837a8b08658-kube-api-access-dd8kx\") pod \"console-68896946c5-t55qf\" (UID: \"71e2c148-aade-4c9c-a8d8-a837a8b08658\") " pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.025094 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.070517 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/316cd090-6e25-48f3-89d6-21f11c7aafd9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-tvjrb\" (UID: \"316cd090-6e25-48f3-89d6-21f11c7aafd9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.080957 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/316cd090-6e25-48f3-89d6-21f11c7aafd9-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-tvjrb\" (UID: \"316cd090-6e25-48f3-89d6-21f11c7aafd9\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.102815 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-qq85c"] Feb 19 08:56:12 crc kubenswrapper[4788]: W0219 08:56:12.116656 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bd96454_70db_4a47_808c_b377bbe1bd00.slice/crio-10762f8b3bc6c5c8b92083a6b46680da2dcfc22fe16839f9a684bbf34f7e48f6 WatchSource:0}: Error finding container 10762f8b3bc6c5c8b92083a6b46680da2dcfc22fe16839f9a684bbf34f7e48f6: Status 404 returned error can't find the container with id 10762f8b3bc6c5c8b92083a6b46680da2dcfc22fe16839f9a684bbf34f7e48f6 Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.191054 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp"] Feb 19 08:56:12 crc kubenswrapper[4788]: W0219 08:56:12.198928 4788 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc4fe95c_3af5_4be2_bf60_2c5e02c18df9.slice/crio-5ce3407daf63c85beae1461f4aba61932ee3b9f15ced9b5435e455f0426875f5 WatchSource:0}: Error finding container 5ce3407daf63c85beae1461f4aba61932ee3b9f15ced9b5435e455f0426875f5: Status 404 returned error can't find the container with id 5ce3407daf63c85beae1461f4aba61932ee3b9f15ced9b5435e455f0426875f5 Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.244186 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68896946c5-t55qf"] Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.269206 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.759154 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb"] Feb 19 08:56:12 crc kubenswrapper[4788]: W0219 08:56:12.769010 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod316cd090_6e25_48f3_89d6_21f11c7aafd9.slice/crio-73472fe9704abcb0405bc5e3af173a8d51155639e0c2fad9f43ba7d70d2730a5 WatchSource:0}: Error finding container 73472fe9704abcb0405bc5e3af173a8d51155639e0c2fad9f43ba7d70d2730a5: Status 404 returned error can't find the container with id 73472fe9704abcb0405bc5e3af173a8d51155639e0c2fad9f43ba7d70d2730a5 Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.854503 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" event={"ID":"316cd090-6e25-48f3-89d6-21f11c7aafd9","Type":"ContainerStarted","Data":"73472fe9704abcb0405bc5e3af173a8d51155639e0c2fad9f43ba7d70d2730a5"} Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.857376 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68896946c5-t55qf" 
event={"ID":"71e2c148-aade-4c9c-a8d8-a837a8b08658","Type":"ContainerStarted","Data":"0ccafa5cd57a2beb0f5d09738870bbd4b75c05ed1a9d6ab0f69870d88c1ff736"} Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.857424 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68896946c5-t55qf" event={"ID":"71e2c148-aade-4c9c-a8d8-a837a8b08658","Type":"ContainerStarted","Data":"bc6892f587e0567c9f869c799028d4f7e95bb7b68ed4db707653bdb5e222add9"} Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.859335 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qq85c" event={"ID":"6bd96454-70db-4a47-808c-b377bbe1bd00","Type":"ContainerStarted","Data":"10762f8b3bc6c5c8b92083a6b46680da2dcfc22fe16839f9a684bbf34f7e48f6"} Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.861120 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" event={"ID":"cc4fe95c-3af5-4be2-bf60-2c5e02c18df9","Type":"ContainerStarted","Data":"5ce3407daf63c85beae1461f4aba61932ee3b9f15ced9b5435e455f0426875f5"} Feb 19 08:56:12 crc kubenswrapper[4788]: I0219 08:56:12.881993 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68896946c5-t55qf" podStartSLOduration=1.881967135 podStartE2EDuration="1.881967135s" podCreationTimestamp="2026-02-19 08:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:56:12.878754171 +0000 UTC m=+674.866765643" watchObservedRunningTime="2026-02-19 08:56:12.881967135 +0000 UTC m=+674.869978607" Feb 19 08:56:15 crc kubenswrapper[4788]: I0219 08:56:15.886396 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s4s7s" 
event={"ID":"5874210a-7a83-4166-8486-0f827772fc30","Type":"ContainerStarted","Data":"dee037cab3e25baf1e2b54c63f65435c400a4c9e3d15e4408a6e43933244abe7"} Feb 19 08:56:15 crc kubenswrapper[4788]: I0219 08:56:15.887086 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:15 crc kubenswrapper[4788]: I0219 08:56:15.889180 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" event={"ID":"cc4fe95c-3af5-4be2-bf60-2c5e02c18df9","Type":"ContainerStarted","Data":"bb0727d1ee980d2bf06e3821a121aeb6b86cb7d742bddbe8bc136426ebc5f4da"} Feb 19 08:56:15 crc kubenswrapper[4788]: I0219 08:56:15.891579 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" event={"ID":"316cd090-6e25-48f3-89d6-21f11c7aafd9","Type":"ContainerStarted","Data":"49b67c73c7e37d8d83ac4de56dd6d0731120886820229dfdec4a94da719e6ad2"} Feb 19 08:56:15 crc kubenswrapper[4788]: I0219 08:56:15.891782 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" Feb 19 08:56:15 crc kubenswrapper[4788]: I0219 08:56:15.893736 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qq85c" event={"ID":"6bd96454-70db-4a47-808c-b377bbe1bd00","Type":"ContainerStarted","Data":"3d16a5ba762d01076703d11c5db344aaf90ffbcb56264ec3ec8799a8198d49f3"} Feb 19 08:56:15 crc kubenswrapper[4788]: I0219 08:56:15.916501 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-s4s7s" podStartSLOduration=1.86299042 podStartE2EDuration="4.916472406s" podCreationTimestamp="2026-02-19 08:56:11 +0000 UTC" firstStartedPulling="2026-02-19 08:56:11.725588252 +0000 UTC m=+673.713599724" lastFinishedPulling="2026-02-19 08:56:14.779070228 +0000 UTC m=+676.767081710" observedRunningTime="2026-02-19 
08:56:15.910824977 +0000 UTC m=+677.898836449" watchObservedRunningTime="2026-02-19 08:56:15.916472406 +0000 UTC m=+677.904483878" Feb 19 08:56:15 crc kubenswrapper[4788]: I0219 08:56:15.930122 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" podStartSLOduration=2.92417538 podStartE2EDuration="4.930103289s" podCreationTimestamp="2026-02-19 08:56:11 +0000 UTC" firstStartedPulling="2026-02-19 08:56:12.771584803 +0000 UTC m=+674.759596275" lastFinishedPulling="2026-02-19 08:56:14.777512702 +0000 UTC m=+676.765524184" observedRunningTime="2026-02-19 08:56:15.924532151 +0000 UTC m=+677.912543623" watchObservedRunningTime="2026-02-19 08:56:15.930103289 +0000 UTC m=+677.918114761" Feb 19 08:56:15 crc kubenswrapper[4788]: I0219 08:56:15.954916 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-4s4xp" podStartSLOduration=2.378389242 podStartE2EDuration="4.954901608s" podCreationTimestamp="2026-02-19 08:56:11 +0000 UTC" firstStartedPulling="2026-02-19 08:56:12.203546695 +0000 UTC m=+674.191558167" lastFinishedPulling="2026-02-19 08:56:14.780059021 +0000 UTC m=+676.768070533" observedRunningTime="2026-02-19 08:56:15.954151871 +0000 UTC m=+677.942163383" watchObservedRunningTime="2026-02-19 08:56:15.954901608 +0000 UTC m=+677.942913080" Feb 19 08:56:17 crc kubenswrapper[4788]: I0219 08:56:17.906910 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qq85c" event={"ID":"6bd96454-70db-4a47-808c-b377bbe1bd00","Type":"ContainerStarted","Data":"70d4ea6a55432079e1735a71c1f6d5ee806a91bc95b1237740b3032f1aba944b"} Feb 19 08:56:17 crc kubenswrapper[4788]: I0219 08:56:17.934654 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-qq85c" podStartSLOduration=2.188089007 podStartE2EDuration="6.934634556s" 
podCreationTimestamp="2026-02-19 08:56:11 +0000 UTC" firstStartedPulling="2026-02-19 08:56:12.121898842 +0000 UTC m=+674.109910324" lastFinishedPulling="2026-02-19 08:56:16.868444401 +0000 UTC m=+678.856455873" observedRunningTime="2026-02-19 08:56:17.932306912 +0000 UTC m=+679.920318464" watchObservedRunningTime="2026-02-19 08:56:17.934634556 +0000 UTC m=+679.922646038" Feb 19 08:56:21 crc kubenswrapper[4788]: I0219 08:56:21.714984 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-s4s7s" Feb 19 08:56:22 crc kubenswrapper[4788]: I0219 08:56:22.025515 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:22 crc kubenswrapper[4788]: I0219 08:56:22.025868 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:22 crc kubenswrapper[4788]: I0219 08:56:22.031133 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:22 crc kubenswrapper[4788]: I0219 08:56:22.946696 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68896946c5-t55qf" Feb 19 08:56:23 crc kubenswrapper[4788]: I0219 08:56:23.040405 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jr6pt"] Feb 19 08:56:32 crc kubenswrapper[4788]: I0219 08:56:32.280635 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-tvjrb" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.174159 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62"] Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.177215 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.186429 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.191398 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62"] Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.291103 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.291274 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7hn8\" (UniqueName: \"kubernetes.io/projected/c1d75f31-f259-4276-aadb-af4b0540b221-kube-api-access-m7hn8\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.291397 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:45 crc kubenswrapper[4788]: 
I0219 08:56:45.392894 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.393051 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.393101 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7hn8\" (UniqueName: \"kubernetes.io/projected/c1d75f31-f259-4276-aadb-af4b0540b221-kube-api-access-m7hn8\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.393774 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.393958 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.437459 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7hn8\" (UniqueName: \"kubernetes.io/projected/c1d75f31-f259-4276-aadb-af4b0540b221-kube-api-access-m7hn8\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.499099 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:45 crc kubenswrapper[4788]: I0219 08:56:45.966274 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62"] Feb 19 08:56:46 crc kubenswrapper[4788]: I0219 08:56:46.131672 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" event={"ID":"c1d75f31-f259-4276-aadb-af4b0540b221","Type":"ContainerStarted","Data":"195ef8efb26ac636d18b33f86dde43c5c6477984a844b473c55eac5e2b805302"} Feb 19 08:56:47 crc kubenswrapper[4788]: I0219 08:56:47.142599 4788 generic.go:334] "Generic (PLEG): container finished" podID="c1d75f31-f259-4276-aadb-af4b0540b221" containerID="fd8ddd49e461db4ecd38d5cb5d4f56a21dc6d76c5d83cc1a291b25c405bb2ddc" exitCode=0 Feb 19 08:56:47 crc kubenswrapper[4788]: I0219 08:56:47.142714 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" event={"ID":"c1d75f31-f259-4276-aadb-af4b0540b221","Type":"ContainerDied","Data":"fd8ddd49e461db4ecd38d5cb5d4f56a21dc6d76c5d83cc1a291b25c405bb2ddc"} Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.093127 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-jr6pt" podUID="7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" containerName="console" containerID="cri-o://0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c" gracePeriod=15 Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.154553 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" event={"ID":"c1d75f31-f259-4276-aadb-af4b0540b221","Type":"ContainerStarted","Data":"ff50911e43ed1a3074972c0cdd0d0b869e015df90ed24ef333e23afcd7c22968"} Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.485818 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jr6pt_7108fd8d-57c8-42b0-9fe2-08ca6b33b2de/console/0.log" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.485916 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.641688 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-serving-cert\") pod \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.641764 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-oauth-config\") pod \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.641834 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-config\") pod \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.641924 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-trusted-ca-bundle\") pod \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.641975 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-oauth-serving-cert\") pod \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.642130 4788 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-service-ca\") pod \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.642209 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4rc2\" (UniqueName: \"kubernetes.io/projected/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-kube-api-access-r4rc2\") pod \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\" (UID: \"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de\") " Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.643211 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-service-ca" (OuterVolumeSpecName: "service-ca") pod "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" (UID: "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.643241 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" (UID: "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.643286 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" (UID: "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.644082 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-config" (OuterVolumeSpecName: "console-config") pod "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" (UID: "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.649924 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" (UID: "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.650504 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-kube-api-access-r4rc2" (OuterVolumeSpecName: "kube-api-access-r4rc2") pod "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" (UID: "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de"). InnerVolumeSpecName "kube-api-access-r4rc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.651030 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" (UID: "7108fd8d-57c8-42b0-9fe2-08ca6b33b2de"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.743663 4788 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.743711 4788 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.743724 4788 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-service-ca\") on node \"crc\" DevicePath \"\"" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.743738 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4rc2\" (UniqueName: \"kubernetes.io/projected/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-kube-api-access-r4rc2\") on node \"crc\" DevicePath \"\"" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.743754 4788 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.743767 4788 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:56:48 crc kubenswrapper[4788]: I0219 08:56:48.743778 4788 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de-console-config\") on node \"crc\" DevicePath \"\"" Feb 19 08:56:49 crc 
kubenswrapper[4788]: I0219 08:56:49.164286 4788 generic.go:334] "Generic (PLEG): container finished" podID="c1d75f31-f259-4276-aadb-af4b0540b221" containerID="ff50911e43ed1a3074972c0cdd0d0b869e015df90ed24ef333e23afcd7c22968" exitCode=0 Feb 19 08:56:49 crc kubenswrapper[4788]: I0219 08:56:49.164399 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" event={"ID":"c1d75f31-f259-4276-aadb-af4b0540b221","Type":"ContainerDied","Data":"ff50911e43ed1a3074972c0cdd0d0b869e015df90ed24ef333e23afcd7c22968"} Feb 19 08:56:49 crc kubenswrapper[4788]: I0219 08:56:49.167291 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jr6pt_7108fd8d-57c8-42b0-9fe2-08ca6b33b2de/console/0.log" Feb 19 08:56:49 crc kubenswrapper[4788]: I0219 08:56:49.167376 4788 generic.go:334] "Generic (PLEG): container finished" podID="7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" containerID="0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c" exitCode=2 Feb 19 08:56:49 crc kubenswrapper[4788]: I0219 08:56:49.167437 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jr6pt" event={"ID":"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de","Type":"ContainerDied","Data":"0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c"} Feb 19 08:56:49 crc kubenswrapper[4788]: I0219 08:56:49.167491 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jr6pt" event={"ID":"7108fd8d-57c8-42b0-9fe2-08ca6b33b2de","Type":"ContainerDied","Data":"791fada88e208cfadeb5033fc2ed56c28681c9f00b0f58d630deda63ba221aaf"} Feb 19 08:56:49 crc kubenswrapper[4788]: I0219 08:56:49.167529 4788 scope.go:117] "RemoveContainer" containerID="0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c" Feb 19 08:56:49 crc kubenswrapper[4788]: I0219 08:56:49.167444 4788 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-jr6pt" Feb 19 08:56:49 crc kubenswrapper[4788]: I0219 08:56:49.204612 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jr6pt"] Feb 19 08:56:49 crc kubenswrapper[4788]: I0219 08:56:49.209134 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-jr6pt"] Feb 19 08:56:49 crc kubenswrapper[4788]: I0219 08:56:49.217999 4788 scope.go:117] "RemoveContainer" containerID="0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c" Feb 19 08:56:49 crc kubenswrapper[4788]: E0219 08:56:49.218677 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c\": container with ID starting with 0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c not found: ID does not exist" containerID="0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c" Feb 19 08:56:49 crc kubenswrapper[4788]: I0219 08:56:49.218735 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c"} err="failed to get container status \"0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c\": rpc error: code = NotFound desc = could not find container \"0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c\": container with ID starting with 0521f6abb95a791c094c99f5180076c061808cf25ac28d31c7f7fb4f3e42cb8c not found: ID does not exist" Feb 19 08:56:50 crc kubenswrapper[4788]: I0219 08:56:50.177847 4788 generic.go:334] "Generic (PLEG): container finished" podID="c1d75f31-f259-4276-aadb-af4b0540b221" containerID="9e932673ebe8652340cc4c0edca0bf560394fd5269084497544238c4697ed353" exitCode=0 Feb 19 08:56:50 crc kubenswrapper[4788]: I0219 08:56:50.177962 4788 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" event={"ID":"c1d75f31-f259-4276-aadb-af4b0540b221","Type":"ContainerDied","Data":"9e932673ebe8652340cc4c0edca0bf560394fd5269084497544238c4697ed353"} Feb 19 08:56:50 crc kubenswrapper[4788]: I0219 08:56:50.728368 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" path="/var/lib/kubelet/pods/7108fd8d-57c8-42b0-9fe2-08ca6b33b2de/volumes" Feb 19 08:56:51 crc kubenswrapper[4788]: I0219 08:56:51.490130 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:56:51 crc kubenswrapper[4788]: I0219 08:56:51.598629 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7hn8\" (UniqueName: \"kubernetes.io/projected/c1d75f31-f259-4276-aadb-af4b0540b221-kube-api-access-m7hn8\") pod \"c1d75f31-f259-4276-aadb-af4b0540b221\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " Feb 19 08:56:51 crc kubenswrapper[4788]: I0219 08:56:51.598751 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-util\") pod \"c1d75f31-f259-4276-aadb-af4b0540b221\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " Feb 19 08:56:51 crc kubenswrapper[4788]: I0219 08:56:51.598884 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-bundle\") pod \"c1d75f31-f259-4276-aadb-af4b0540b221\" (UID: \"c1d75f31-f259-4276-aadb-af4b0540b221\") " Feb 19 08:56:51 crc kubenswrapper[4788]: I0219 08:56:51.600911 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-bundle" (OuterVolumeSpecName: "bundle") pod "c1d75f31-f259-4276-aadb-af4b0540b221" (UID: "c1d75f31-f259-4276-aadb-af4b0540b221"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:56:51 crc kubenswrapper[4788]: I0219 08:56:51.605064 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d75f31-f259-4276-aadb-af4b0540b221-kube-api-access-m7hn8" (OuterVolumeSpecName: "kube-api-access-m7hn8") pod "c1d75f31-f259-4276-aadb-af4b0540b221" (UID: "c1d75f31-f259-4276-aadb-af4b0540b221"). InnerVolumeSpecName "kube-api-access-m7hn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:56:51 crc kubenswrapper[4788]: I0219 08:56:51.700971 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7hn8\" (UniqueName: \"kubernetes.io/projected/c1d75f31-f259-4276-aadb-af4b0540b221-kube-api-access-m7hn8\") on node \"crc\" DevicePath \"\"" Feb 19 08:56:51 crc kubenswrapper[4788]: I0219 08:56:51.701001 4788 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 08:56:51 crc kubenswrapper[4788]: I0219 08:56:51.916306 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-util" (OuterVolumeSpecName: "util") pod "c1d75f31-f259-4276-aadb-af4b0540b221" (UID: "c1d75f31-f259-4276-aadb-af4b0540b221"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 08:56:52 crc kubenswrapper[4788]: I0219 08:56:52.005604 4788 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1d75f31-f259-4276-aadb-af4b0540b221-util\") on node \"crc\" DevicePath \"\"" Feb 19 08:56:52 crc kubenswrapper[4788]: I0219 08:56:52.139766 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 08:56:52 crc kubenswrapper[4788]: I0219 08:56:52.139869 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 08:56:52 crc kubenswrapper[4788]: I0219 08:56:52.191953 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" event={"ID":"c1d75f31-f259-4276-aadb-af4b0540b221","Type":"ContainerDied","Data":"195ef8efb26ac636d18b33f86dde43c5c6477984a844b473c55eac5e2b805302"} Feb 19 08:56:52 crc kubenswrapper[4788]: I0219 08:56:52.192036 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="195ef8efb26ac636d18b33f86dde43c5c6477984a844b473c55eac5e2b805302" Feb 19 08:56:52 crc kubenswrapper[4788]: I0219 08:56:52.192063 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.095753 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k"] Feb 19 08:57:00 crc kubenswrapper[4788]: E0219 08:57:00.096442 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" containerName="console" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.096453 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" containerName="console" Feb 19 08:57:00 crc kubenswrapper[4788]: E0219 08:57:00.096467 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d75f31-f259-4276-aadb-af4b0540b221" containerName="extract" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.096473 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d75f31-f259-4276-aadb-af4b0540b221" containerName="extract" Feb 19 08:57:00 crc kubenswrapper[4788]: E0219 08:57:00.096482 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d75f31-f259-4276-aadb-af4b0540b221" containerName="util" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.096488 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d75f31-f259-4276-aadb-af4b0540b221" containerName="util" Feb 19 08:57:00 crc kubenswrapper[4788]: E0219 08:57:00.096502 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d75f31-f259-4276-aadb-af4b0540b221" containerName="pull" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.096508 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d75f31-f259-4276-aadb-af4b0540b221" containerName="pull" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.096595 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="7108fd8d-57c8-42b0-9fe2-08ca6b33b2de" 
containerName="console" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.096605 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d75f31-f259-4276-aadb-af4b0540b221" containerName="extract" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.097041 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.098740 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.098832 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.099059 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hlfsx" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.099238 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.109602 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.119795 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k"] Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.228459 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb7fcd0f-b451-422c-8984-1494da7aec38-webhook-cert\") pod \"metallb-operator-controller-manager-56888c5b56-9lw8k\" (UID: \"fb7fcd0f-b451-422c-8984-1494da7aec38\") " pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 
08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.228537 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttkwb\" (UniqueName: \"kubernetes.io/projected/fb7fcd0f-b451-422c-8984-1494da7aec38-kube-api-access-ttkwb\") pod \"metallb-operator-controller-manager-56888c5b56-9lw8k\" (UID: \"fb7fcd0f-b451-422c-8984-1494da7aec38\") " pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.228574 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb7fcd0f-b451-422c-8984-1494da7aec38-apiservice-cert\") pod \"metallb-operator-controller-manager-56888c5b56-9lw8k\" (UID: \"fb7fcd0f-b451-422c-8984-1494da7aec38\") " pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.329838 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb7fcd0f-b451-422c-8984-1494da7aec38-webhook-cert\") pod \"metallb-operator-controller-manager-56888c5b56-9lw8k\" (UID: \"fb7fcd0f-b451-422c-8984-1494da7aec38\") " pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.329917 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttkwb\" (UniqueName: \"kubernetes.io/projected/fb7fcd0f-b451-422c-8984-1494da7aec38-kube-api-access-ttkwb\") pod \"metallb-operator-controller-manager-56888c5b56-9lw8k\" (UID: \"fb7fcd0f-b451-422c-8984-1494da7aec38\") " pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.329959 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/fb7fcd0f-b451-422c-8984-1494da7aec38-apiservice-cert\") pod \"metallb-operator-controller-manager-56888c5b56-9lw8k\" (UID: \"fb7fcd0f-b451-422c-8984-1494da7aec38\") " pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.335884 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fb7fcd0f-b451-422c-8984-1494da7aec38-apiservice-cert\") pod \"metallb-operator-controller-manager-56888c5b56-9lw8k\" (UID: \"fb7fcd0f-b451-422c-8984-1494da7aec38\") " pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.336909 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fb7fcd0f-b451-422c-8984-1494da7aec38-webhook-cert\") pod \"metallb-operator-controller-manager-56888c5b56-9lw8k\" (UID: \"fb7fcd0f-b451-422c-8984-1494da7aec38\") " pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.339501 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk"] Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.340141 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.341520 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-gg8xq" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.342628 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.343679 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.355466 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttkwb\" (UniqueName: \"kubernetes.io/projected/fb7fcd0f-b451-422c-8984-1494da7aec38-kube-api-access-ttkwb\") pod \"metallb-operator-controller-manager-56888c5b56-9lw8k\" (UID: \"fb7fcd0f-b451-422c-8984-1494da7aec38\") " pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.367917 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk"] Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.415315 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.430882 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/608a07cc-88f0-405b-87de-f43cc5ee8989-webhook-cert\") pod \"metallb-operator-webhook-server-6b56bc649d-sz2mk\" (UID: \"608a07cc-88f0-405b-87de-f43cc5ee8989\") " pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.430945 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/608a07cc-88f0-405b-87de-f43cc5ee8989-apiservice-cert\") pod \"metallb-operator-webhook-server-6b56bc649d-sz2mk\" (UID: \"608a07cc-88f0-405b-87de-f43cc5ee8989\") " pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.430972 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46hhr\" (UniqueName: \"kubernetes.io/projected/608a07cc-88f0-405b-87de-f43cc5ee8989-kube-api-access-46hhr\") pod \"metallb-operator-webhook-server-6b56bc649d-sz2mk\" (UID: \"608a07cc-88f0-405b-87de-f43cc5ee8989\") " pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.536026 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/608a07cc-88f0-405b-87de-f43cc5ee8989-webhook-cert\") pod \"metallb-operator-webhook-server-6b56bc649d-sz2mk\" (UID: \"608a07cc-88f0-405b-87de-f43cc5ee8989\") " pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.536512 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/608a07cc-88f0-405b-87de-f43cc5ee8989-apiservice-cert\") pod \"metallb-operator-webhook-server-6b56bc649d-sz2mk\" (UID: \"608a07cc-88f0-405b-87de-f43cc5ee8989\") " pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.536550 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46hhr\" (UniqueName: \"kubernetes.io/projected/608a07cc-88f0-405b-87de-f43cc5ee8989-kube-api-access-46hhr\") pod \"metallb-operator-webhook-server-6b56bc649d-sz2mk\" (UID: \"608a07cc-88f0-405b-87de-f43cc5ee8989\") " pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.543725 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/608a07cc-88f0-405b-87de-f43cc5ee8989-apiservice-cert\") pod \"metallb-operator-webhook-server-6b56bc649d-sz2mk\" (UID: \"608a07cc-88f0-405b-87de-f43cc5ee8989\") " pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.556737 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46hhr\" (UniqueName: \"kubernetes.io/projected/608a07cc-88f0-405b-87de-f43cc5ee8989-kube-api-access-46hhr\") pod \"metallb-operator-webhook-server-6b56bc649d-sz2mk\" (UID: \"608a07cc-88f0-405b-87de-f43cc5ee8989\") " pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.560147 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/608a07cc-88f0-405b-87de-f43cc5ee8989-webhook-cert\") pod \"metallb-operator-webhook-server-6b56bc649d-sz2mk\" (UID: \"608a07cc-88f0-405b-87de-f43cc5ee8989\") 
" pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.640793 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k"] Feb 19 08:57:00 crc kubenswrapper[4788]: I0219 08:57:00.694739 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:01 crc kubenswrapper[4788]: I0219 08:57:01.191824 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk"] Feb 19 08:57:01 crc kubenswrapper[4788]: W0219 08:57:01.196387 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod608a07cc_88f0_405b_87de_f43cc5ee8989.slice/crio-e2137a53e4d515b7051dffb319e44665a7aad2f4408790e42381a9f34fac9ca9 WatchSource:0}: Error finding container e2137a53e4d515b7051dffb319e44665a7aad2f4408790e42381a9f34fac9ca9: Status 404 returned error can't find the container with id e2137a53e4d515b7051dffb319e44665a7aad2f4408790e42381a9f34fac9ca9 Feb 19 08:57:01 crc kubenswrapper[4788]: I0219 08:57:01.250492 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" event={"ID":"fb7fcd0f-b451-422c-8984-1494da7aec38","Type":"ContainerStarted","Data":"659d86b0973f89abbe4f8fddc85b9de1b35dc1a98e765cfb964c0b5619206804"} Feb 19 08:57:01 crc kubenswrapper[4788]: I0219 08:57:01.251681 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" event={"ID":"608a07cc-88f0-405b-87de-f43cc5ee8989","Type":"ContainerStarted","Data":"e2137a53e4d515b7051dffb319e44665a7aad2f4408790e42381a9f34fac9ca9"} Feb 19 08:57:04 crc kubenswrapper[4788]: I0219 08:57:04.270090 4788 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" event={"ID":"fb7fcd0f-b451-422c-8984-1494da7aec38","Type":"ContainerStarted","Data":"f3d32bacacad440d5c5707d62ad1cb0cb2b6b28363af214b8e7082737cee96db"} Feb 19 08:57:04 crc kubenswrapper[4788]: I0219 08:57:04.271561 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:04 crc kubenswrapper[4788]: I0219 08:57:04.289617 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" podStartSLOduration=1.648565272 podStartE2EDuration="4.289597651s" podCreationTimestamp="2026-02-19 08:57:00 +0000 UTC" firstStartedPulling="2026-02-19 08:57:00.652519113 +0000 UTC m=+722.640530585" lastFinishedPulling="2026-02-19 08:57:03.293551492 +0000 UTC m=+725.281562964" observedRunningTime="2026-02-19 08:57:04.288202978 +0000 UTC m=+726.276214450" watchObservedRunningTime="2026-02-19 08:57:04.289597651 +0000 UTC m=+726.277609123" Feb 19 08:57:06 crc kubenswrapper[4788]: I0219 08:57:06.281805 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" event={"ID":"608a07cc-88f0-405b-87de-f43cc5ee8989","Type":"ContainerStarted","Data":"92eda0033768ba2ca23747bd91410a4e3e6932c0148ae11a1843a354c77af2f4"} Feb 19 08:57:06 crc kubenswrapper[4788]: I0219 08:57:06.282144 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:06 crc kubenswrapper[4788]: I0219 08:57:06.300574 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" podStartSLOduration=2.2674907859999998 podStartE2EDuration="6.300554774s" podCreationTimestamp="2026-02-19 08:57:00 +0000 UTC" firstStartedPulling="2026-02-19 
08:57:01.199321795 +0000 UTC m=+723.187333267" lastFinishedPulling="2026-02-19 08:57:05.232385783 +0000 UTC m=+727.220397255" observedRunningTime="2026-02-19 08:57:06.298514766 +0000 UTC m=+728.286526268" watchObservedRunningTime="2026-02-19 08:57:06.300554774 +0000 UTC m=+728.288566246" Feb 19 08:57:20 crc kubenswrapper[4788]: I0219 08:57:20.698869 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6b56bc649d-sz2mk" Feb 19 08:57:22 crc kubenswrapper[4788]: I0219 08:57:22.139928 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 08:57:22 crc kubenswrapper[4788]: I0219 08:57:22.140015 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 08:57:33 crc kubenswrapper[4788]: I0219 08:57:33.969898 4788 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 19 08:57:40 crc kubenswrapper[4788]: I0219 08:57:40.419536 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-56888c5b56-9lw8k" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.228410 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn"] Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.229478 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.231489 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.231753 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-h9lb6"] Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.232098 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-s7mbw" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.234227 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.235616 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.237084 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.283456 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn"] Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.297481 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-reloader\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.297527 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-frr-conf\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " 
pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.297553 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47634f49-df39-43a3-8c1a-850fa890d4dc-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-25kjn\" (UID: \"47634f49-df39-43a3-8c1a-850fa890d4dc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.297572 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-metrics\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.297594 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6qvm\" (UniqueName: \"kubernetes.io/projected/479438a3-ab89-4f1f-a1b8-01ac3c012454-kube-api-access-d6qvm\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.297623 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-frr-sockets\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.297646 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/479438a3-ab89-4f1f-a1b8-01ac3c012454-frr-startup\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc 
kubenswrapper[4788]: I0219 08:57:41.297666 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/479438a3-ab89-4f1f-a1b8-01ac3c012454-metrics-certs\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.297683 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jb99\" (UniqueName: \"kubernetes.io/projected/47634f49-df39-43a3-8c1a-850fa890d4dc-kube-api-access-8jb99\") pod \"frr-k8s-webhook-server-78b44bf5bb-25kjn\" (UID: \"47634f49-df39-43a3-8c1a-850fa890d4dc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.302705 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-64rbq"] Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.303794 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.305917 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.306008 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.306141 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.308081 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-fgfcg" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.310281 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-5z7c2"] Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.311152 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.314608 4788 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.323944 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-5z7c2"] Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399195 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-reloader\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399271 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-frr-conf\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399314 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47634f49-df39-43a3-8c1a-850fa890d4dc-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-25kjn\" (UID: \"47634f49-df39-43a3-8c1a-850fa890d4dc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399351 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-metrics\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399383 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-d6qvm\" (UniqueName: \"kubernetes.io/projected/479438a3-ab89-4f1f-a1b8-01ac3c012454-kube-api-access-d6qvm\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399425 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfw6t\" (UniqueName: \"kubernetes.io/projected/d0c713d1-6916-421f-8876-757d3d7dfa45-kube-api-access-lfw6t\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399454 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-frr-sockets\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399480 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd37a23b-542c-489a-90f2-ed7b82c59ec0-cert\") pod \"controller-69bbfbf88f-5z7c2\" (UID: \"bd37a23b-542c-489a-90f2-ed7b82c59ec0\") " pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399534 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw9rk\" (UniqueName: \"kubernetes.io/projected/bd37a23b-542c-489a-90f2-ed7b82c59ec0-kube-api-access-jw9rk\") pod \"controller-69bbfbf88f-5z7c2\" (UID: \"bd37a23b-542c-489a-90f2-ed7b82c59ec0\") " pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399564 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/479438a3-ab89-4f1f-a1b8-01ac3c012454-frr-startup\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399639 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-reloader\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399645 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/479438a3-ab89-4f1f-a1b8-01ac3c012454-metrics-certs\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399677 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-memberlist\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399712 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jb99\" (UniqueName: \"kubernetes.io/projected/47634f49-df39-43a3-8c1a-850fa890d4dc-kube-api-access-8jb99\") pod \"frr-k8s-webhook-server-78b44bf5bb-25kjn\" (UID: \"47634f49-df39-43a3-8c1a-850fa890d4dc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399734 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd37a23b-542c-489a-90f2-ed7b82c59ec0-metrics-certs\") pod \"controller-69bbfbf88f-5z7c2\" 
(UID: \"bd37a23b-542c-489a-90f2-ed7b82c59ec0\") " pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:41 crc kubenswrapper[4788]: E0219 08:57:41.399738 4788 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 19 08:57:41 crc kubenswrapper[4788]: E0219 08:57:41.399782 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/479438a3-ab89-4f1f-a1b8-01ac3c012454-metrics-certs podName:479438a3-ab89-4f1f-a1b8-01ac3c012454 nodeName:}" failed. No retries permitted until 2026-02-19 08:57:41.899765137 +0000 UTC m=+763.887776599 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/479438a3-ab89-4f1f-a1b8-01ac3c012454-metrics-certs") pod "frr-k8s-h9lb6" (UID: "479438a3-ab89-4f1f-a1b8-01ac3c012454") : secret "frr-k8s-certs-secret" not found Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399813 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-metrics-certs\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.399834 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d0c713d1-6916-421f-8876-757d3d7dfa45-metallb-excludel2\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.400280 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-metrics\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " 
pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.400318 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-frr-conf\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.400513 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/479438a3-ab89-4f1f-a1b8-01ac3c012454-frr-sockets\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: E0219 08:57:41.400575 4788 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 19 08:57:41 crc kubenswrapper[4788]: E0219 08:57:41.400838 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47634f49-df39-43a3-8c1a-850fa890d4dc-cert podName:47634f49-df39-43a3-8c1a-850fa890d4dc nodeName:}" failed. No retries permitted until 2026-02-19 08:57:41.900826182 +0000 UTC m=+763.888837654 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47634f49-df39-43a3-8c1a-850fa890d4dc-cert") pod "frr-k8s-webhook-server-78b44bf5bb-25kjn" (UID: "47634f49-df39-43a3-8c1a-850fa890d4dc") : secret "frr-k8s-webhook-server-cert" not found Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.401212 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/479438a3-ab89-4f1f-a1b8-01ac3c012454-frr-startup\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.427952 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6qvm\" (UniqueName: \"kubernetes.io/projected/479438a3-ab89-4f1f-a1b8-01ac3c012454-kube-api-access-d6qvm\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.427952 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jb99\" (UniqueName: \"kubernetes.io/projected/47634f49-df39-43a3-8c1a-850fa890d4dc-kube-api-access-8jb99\") pod \"frr-k8s-webhook-server-78b44bf5bb-25kjn\" (UID: \"47634f49-df39-43a3-8c1a-850fa890d4dc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.501576 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfw6t\" (UniqueName: \"kubernetes.io/projected/d0c713d1-6916-421f-8876-757d3d7dfa45-kube-api-access-lfw6t\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.501639 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/bd37a23b-542c-489a-90f2-ed7b82c59ec0-cert\") pod \"controller-69bbfbf88f-5z7c2\" (UID: \"bd37a23b-542c-489a-90f2-ed7b82c59ec0\") " pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.501670 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw9rk\" (UniqueName: \"kubernetes.io/projected/bd37a23b-542c-489a-90f2-ed7b82c59ec0-kube-api-access-jw9rk\") pod \"controller-69bbfbf88f-5z7c2\" (UID: \"bd37a23b-542c-489a-90f2-ed7b82c59ec0\") " pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.501712 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-memberlist\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.501738 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd37a23b-542c-489a-90f2-ed7b82c59ec0-metrics-certs\") pod \"controller-69bbfbf88f-5z7c2\" (UID: \"bd37a23b-542c-489a-90f2-ed7b82c59ec0\") " pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.501782 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-metrics-certs\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.501805 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d0c713d1-6916-421f-8876-757d3d7dfa45-metallb-excludel2\") pod \"speaker-64rbq\" (UID: 
\"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: E0219 08:57:41.502237 4788 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 08:57:41 crc kubenswrapper[4788]: E0219 08:57:41.502354 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-memberlist podName:d0c713d1-6916-421f-8876-757d3d7dfa45 nodeName:}" failed. No retries permitted until 2026-02-19 08:57:42.002334819 +0000 UTC m=+763.990346291 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-memberlist") pod "speaker-64rbq" (UID: "d0c713d1-6916-421f-8876-757d3d7dfa45") : secret "metallb-memberlist" not found Feb 19 08:57:41 crc kubenswrapper[4788]: E0219 08:57:41.502650 4788 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 19 08:57:41 crc kubenswrapper[4788]: E0219 08:57:41.502691 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd37a23b-542c-489a-90f2-ed7b82c59ec0-metrics-certs podName:bd37a23b-542c-489a-90f2-ed7b82c59ec0 nodeName:}" failed. No retries permitted until 2026-02-19 08:57:42.002680617 +0000 UTC m=+763.990692089 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd37a23b-542c-489a-90f2-ed7b82c59ec0-metrics-certs") pod "controller-69bbfbf88f-5z7c2" (UID: "bd37a23b-542c-489a-90f2-ed7b82c59ec0") : secret "controller-certs-secret" not found Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.502734 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d0c713d1-6916-421f-8876-757d3d7dfa45-metallb-excludel2\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.507221 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd37a23b-542c-489a-90f2-ed7b82c59ec0-cert\") pod \"controller-69bbfbf88f-5z7c2\" (UID: \"bd37a23b-542c-489a-90f2-ed7b82c59ec0\") " pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.507314 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-metrics-certs\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.520293 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw9rk\" (UniqueName: \"kubernetes.io/projected/bd37a23b-542c-489a-90f2-ed7b82c59ec0-kube-api-access-jw9rk\") pod \"controller-69bbfbf88f-5z7c2\" (UID: \"bd37a23b-542c-489a-90f2-ed7b82c59ec0\") " pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.526718 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfw6t\" (UniqueName: 
\"kubernetes.io/projected/d0c713d1-6916-421f-8876-757d3d7dfa45-kube-api-access-lfw6t\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.907886 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/479438a3-ab89-4f1f-a1b8-01ac3c012454-metrics-certs\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.908083 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47634f49-df39-43a3-8c1a-850fa890d4dc-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-25kjn\" (UID: \"47634f49-df39-43a3-8c1a-850fa890d4dc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.912911 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47634f49-df39-43a3-8c1a-850fa890d4dc-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-25kjn\" (UID: \"47634f49-df39-43a3-8c1a-850fa890d4dc\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" Feb 19 08:57:41 crc kubenswrapper[4788]: I0219 08:57:41.913538 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/479438a3-ab89-4f1f-a1b8-01ac3c012454-metrics-certs\") pod \"frr-k8s-h9lb6\" (UID: \"479438a3-ab89-4f1f-a1b8-01ac3c012454\") " pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:42 crc kubenswrapper[4788]: I0219 08:57:42.009433 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-memberlist\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") 
" pod="metallb-system/speaker-64rbq" Feb 19 08:57:42 crc kubenswrapper[4788]: I0219 08:57:42.009511 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd37a23b-542c-489a-90f2-ed7b82c59ec0-metrics-certs\") pod \"controller-69bbfbf88f-5z7c2\" (UID: \"bd37a23b-542c-489a-90f2-ed7b82c59ec0\") " pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:42 crc kubenswrapper[4788]: E0219 08:57:42.010218 4788 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 19 08:57:42 crc kubenswrapper[4788]: E0219 08:57:42.010333 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-memberlist podName:d0c713d1-6916-421f-8876-757d3d7dfa45 nodeName:}" failed. No retries permitted until 2026-02-19 08:57:43.010313572 +0000 UTC m=+764.998325054 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-memberlist") pod "speaker-64rbq" (UID: "d0c713d1-6916-421f-8876-757d3d7dfa45") : secret "metallb-memberlist" not found Feb 19 08:57:42 crc kubenswrapper[4788]: I0219 08:57:42.015760 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd37a23b-542c-489a-90f2-ed7b82c59ec0-metrics-certs\") pod \"controller-69bbfbf88f-5z7c2\" (UID: \"bd37a23b-542c-489a-90f2-ed7b82c59ec0\") " pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:42 crc kubenswrapper[4788]: I0219 08:57:42.149634 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" Feb 19 08:57:42 crc kubenswrapper[4788]: I0219 08:57:42.159513 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:42 crc kubenswrapper[4788]: I0219 08:57:42.236519 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:42 crc kubenswrapper[4788]: I0219 08:57:42.513636 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-5z7c2"] Feb 19 08:57:42 crc kubenswrapper[4788]: W0219 08:57:42.519836 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd37a23b_542c_489a_90f2_ed7b82c59ec0.slice/crio-866dc087b5e9ed451dd96d8af5407d4b7e5de1057bb1174c3250618bae799883 WatchSource:0}: Error finding container 866dc087b5e9ed451dd96d8af5407d4b7e5de1057bb1174c3250618bae799883: Status 404 returned error can't find the container with id 866dc087b5e9ed451dd96d8af5407d4b7e5de1057bb1174c3250618bae799883 Feb 19 08:57:42 crc kubenswrapper[4788]: I0219 08:57:42.523492 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9lb6" event={"ID":"479438a3-ab89-4f1f-a1b8-01ac3c012454","Type":"ContainerStarted","Data":"d8a8525c24500204ffebea39f904a7676db3e2335305dc03f57781493ba19498"} Feb 19 08:57:42 crc kubenswrapper[4788]: I0219 08:57:42.639632 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn"] Feb 19 08:57:42 crc kubenswrapper[4788]: W0219 08:57:42.649092 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47634f49_df39_43a3_8c1a_850fa890d4dc.slice/crio-d1e132f6979a7dc23a562f9d7dddcf50b30c7bc36bf16b33f788a0c458901c7b WatchSource:0}: Error finding container d1e132f6979a7dc23a562f9d7dddcf50b30c7bc36bf16b33f788a0c458901c7b: Status 404 returned error can't find the container with id d1e132f6979a7dc23a562f9d7dddcf50b30c7bc36bf16b33f788a0c458901c7b Feb 19 08:57:43 
crc kubenswrapper[4788]: I0219 08:57:43.022037 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-memberlist\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:43 crc kubenswrapper[4788]: I0219 08:57:43.028708 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d0c713d1-6916-421f-8876-757d3d7dfa45-memberlist\") pod \"speaker-64rbq\" (UID: \"d0c713d1-6916-421f-8876-757d3d7dfa45\") " pod="metallb-system/speaker-64rbq" Feb 19 08:57:43 crc kubenswrapper[4788]: I0219 08:57:43.123765 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-64rbq" Feb 19 08:57:43 crc kubenswrapper[4788]: I0219 08:57:43.538362 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-64rbq" event={"ID":"d0c713d1-6916-421f-8876-757d3d7dfa45","Type":"ContainerStarted","Data":"91bd57746a9330caa945483f7463f006b97dad7772f7055c1dda7617d8ee0023"} Feb 19 08:57:43 crc kubenswrapper[4788]: I0219 08:57:43.538778 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-64rbq" event={"ID":"d0c713d1-6916-421f-8876-757d3d7dfa45","Type":"ContainerStarted","Data":"a7cec5867f8e60f3797859062161431ec4abdd26505e4be7311abd245478e767"} Feb 19 08:57:43 crc kubenswrapper[4788]: I0219 08:57:43.546726 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-5z7c2" event={"ID":"bd37a23b-542c-489a-90f2-ed7b82c59ec0","Type":"ContainerStarted","Data":"4ce66e70b9dc519b5014d7347ed09393f270e693febca6d2d2fb6d1d2717a9f1"} Feb 19 08:57:43 crc kubenswrapper[4788]: I0219 08:57:43.546773 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-5z7c2" 
event={"ID":"bd37a23b-542c-489a-90f2-ed7b82c59ec0","Type":"ContainerStarted","Data":"c1aff9e2eb85cc5d16a2074c9839c5cf88d4b049ae9e87790a0b3f576bcce6be"} Feb 19 08:57:43 crc kubenswrapper[4788]: I0219 08:57:43.546785 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-5z7c2" event={"ID":"bd37a23b-542c-489a-90f2-ed7b82c59ec0","Type":"ContainerStarted","Data":"866dc087b5e9ed451dd96d8af5407d4b7e5de1057bb1174c3250618bae799883"} Feb 19 08:57:43 crc kubenswrapper[4788]: I0219 08:57:43.546851 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:43 crc kubenswrapper[4788]: I0219 08:57:43.552765 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" event={"ID":"47634f49-df39-43a3-8c1a-850fa890d4dc","Type":"ContainerStarted","Data":"d1e132f6979a7dc23a562f9d7dddcf50b30c7bc36bf16b33f788a0c458901c7b"} Feb 19 08:57:43 crc kubenswrapper[4788]: I0219 08:57:43.566113 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-5z7c2" podStartSLOduration=2.566094307 podStartE2EDuration="2.566094307s" podCreationTimestamp="2026-02-19 08:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:57:43.563648829 +0000 UTC m=+765.551660301" watchObservedRunningTime="2026-02-19 08:57:43.566094307 +0000 UTC m=+765.554105779" Feb 19 08:57:44 crc kubenswrapper[4788]: I0219 08:57:44.566577 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-64rbq" event={"ID":"d0c713d1-6916-421f-8876-757d3d7dfa45","Type":"ContainerStarted","Data":"14de6bd15804c45466ae723d06eae371b996c37cf161ab2730f5e8cc6aafa2c7"} Feb 19 08:57:44 crc kubenswrapper[4788]: I0219 08:57:44.592116 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/speaker-64rbq" podStartSLOduration=3.592094033 podStartE2EDuration="3.592094033s" podCreationTimestamp="2026-02-19 08:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:57:44.588287973 +0000 UTC m=+766.576299445" watchObservedRunningTime="2026-02-19 08:57:44.592094033 +0000 UTC m=+766.580105495" Feb 19 08:57:45 crc kubenswrapper[4788]: I0219 08:57:45.576534 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-64rbq" Feb 19 08:57:49 crc kubenswrapper[4788]: I0219 08:57:49.609497 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9lb6" event={"ID":"479438a3-ab89-4f1f-a1b8-01ac3c012454","Type":"ContainerStarted","Data":"a457206054e0d546e26087796d31bc6c430e0bcc653cfe6ad6e9e7c7484a2e72"} Feb 19 08:57:49 crc kubenswrapper[4788]: I0219 08:57:49.614154 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" event={"ID":"47634f49-df39-43a3-8c1a-850fa890d4dc","Type":"ContainerStarted","Data":"09274972d4a474775f47f161feb92ffd15ff67bfc987e7c3f2b8e1dee50a957f"} Feb 19 08:57:49 crc kubenswrapper[4788]: I0219 08:57:49.614691 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" Feb 19 08:57:49 crc kubenswrapper[4788]: I0219 08:57:49.670540 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" podStartSLOduration=1.8939605259999999 podStartE2EDuration="8.670513192s" podCreationTimestamp="2026-02-19 08:57:41 +0000 UTC" firstStartedPulling="2026-02-19 08:57:42.651231295 +0000 UTC m=+764.639242777" lastFinishedPulling="2026-02-19 08:57:49.427783971 +0000 UTC m=+771.415795443" observedRunningTime="2026-02-19 08:57:49.66194464 +0000 UTC m=+771.649956102" 
watchObservedRunningTime="2026-02-19 08:57:49.670513192 +0000 UTC m=+771.658524664" Feb 19 08:57:50 crc kubenswrapper[4788]: I0219 08:57:50.626319 4788 generic.go:334] "Generic (PLEG): container finished" podID="479438a3-ab89-4f1f-a1b8-01ac3c012454" containerID="a457206054e0d546e26087796d31bc6c430e0bcc653cfe6ad6e9e7c7484a2e72" exitCode=0 Feb 19 08:57:50 crc kubenswrapper[4788]: I0219 08:57:50.626367 4788 generic.go:334] "Generic (PLEG): container finished" podID="479438a3-ab89-4f1f-a1b8-01ac3c012454" containerID="bb9c4f536273fdd6b38327e351d4345a79b381ad961abdc2569867c0a1708c52" exitCode=0 Feb 19 08:57:50 crc kubenswrapper[4788]: I0219 08:57:50.626415 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9lb6" event={"ID":"479438a3-ab89-4f1f-a1b8-01ac3c012454","Type":"ContainerDied","Data":"a457206054e0d546e26087796d31bc6c430e0bcc653cfe6ad6e9e7c7484a2e72"} Feb 19 08:57:50 crc kubenswrapper[4788]: I0219 08:57:50.626509 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9lb6" event={"ID":"479438a3-ab89-4f1f-a1b8-01ac3c012454","Type":"ContainerDied","Data":"bb9c4f536273fdd6b38327e351d4345a79b381ad961abdc2569867c0a1708c52"} Feb 19 08:57:51 crc kubenswrapper[4788]: I0219 08:57:51.639397 4788 generic.go:334] "Generic (PLEG): container finished" podID="479438a3-ab89-4f1f-a1b8-01ac3c012454" containerID="ca1b91a4358829f2380ef6b5b1aed34988833353cfb4210e46ea44cd0dfc85de" exitCode=0 Feb 19 08:57:51 crc kubenswrapper[4788]: I0219 08:57:51.639462 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9lb6" event={"ID":"479438a3-ab89-4f1f-a1b8-01ac3c012454","Type":"ContainerDied","Data":"ca1b91a4358829f2380ef6b5b1aed34988833353cfb4210e46ea44cd0dfc85de"} Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.139821 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.139922 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.139993 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.140967 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e35f3f513564bff5ea2198c67409383ba481d995f59e4d674440d785743deb5"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.141063 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://7e35f3f513564bff5ea2198c67409383ba481d995f59e4d674440d785743deb5" gracePeriod=600 Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.241964 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-5z7c2" Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.659671 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9lb6" 
event={"ID":"479438a3-ab89-4f1f-a1b8-01ac3c012454","Type":"ContainerStarted","Data":"b36c39bdc2c5cce30e0377f74ee32fa47e22d9cf7d2da8df6db5f561b03e7f24"} Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.659727 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9lb6" event={"ID":"479438a3-ab89-4f1f-a1b8-01ac3c012454","Type":"ContainerStarted","Data":"4eb51c4a6ae451892eb4a39bb332a73b8e64d114a2637bbbd7f315fa5c9990c3"} Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.659741 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9lb6" event={"ID":"479438a3-ab89-4f1f-a1b8-01ac3c012454","Type":"ContainerStarted","Data":"7b375b18daae65af74977a2698f215b25a9cbb8ac4a2f9a47ef3fcbd2a4db155"} Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.659754 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9lb6" event={"ID":"479438a3-ab89-4f1f-a1b8-01ac3c012454","Type":"ContainerStarted","Data":"49f7d575e5c238be9794c19cbea086de983e6e5b9f173d1ae78312ad248545f0"} Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.659765 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9lb6" event={"ID":"479438a3-ab89-4f1f-a1b8-01ac3c012454","Type":"ContainerStarted","Data":"dc181d8dde77bd7341d0a9a6ec383f4850c412f156f115b1fb9780aa559f7356"} Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.666052 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="7e35f3f513564bff5ea2198c67409383ba481d995f59e4d674440d785743deb5" exitCode=0 Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.666103 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"7e35f3f513564bff5ea2198c67409383ba481d995f59e4d674440d785743deb5"} Feb 19 08:57:52 crc 
kubenswrapper[4788]: I0219 08:57:52.666135 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"d5bb01dc9098ceeecfb8b55d79c5464f45f8f2b74f74a77633116b07488417cd"} Feb 19 08:57:52 crc kubenswrapper[4788]: I0219 08:57:52.666153 4788 scope.go:117] "RemoveContainer" containerID="2ddcd6e3366811879446e0480e0ef6c2ddb99413bd8f64b1d4a29ea61b1f6c94" Feb 19 08:57:53 crc kubenswrapper[4788]: I0219 08:57:53.133233 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-64rbq" Feb 19 08:57:53 crc kubenswrapper[4788]: I0219 08:57:53.712879 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-h9lb6" event={"ID":"479438a3-ab89-4f1f-a1b8-01ac3c012454","Type":"ContainerStarted","Data":"d3d5262c192ca80fc7f4b5902820de45cf9598aa48e940c5480b83e34e5266c9"} Feb 19 08:57:53 crc kubenswrapper[4788]: I0219 08:57:53.713161 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:53 crc kubenswrapper[4788]: I0219 08:57:53.736876 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-h9lb6" podStartSLOduration=5.714840844 podStartE2EDuration="12.736852036s" podCreationTimestamp="2026-02-19 08:57:41 +0000 UTC" firstStartedPulling="2026-02-19 08:57:42.373068157 +0000 UTC m=+764.361079629" lastFinishedPulling="2026-02-19 08:57:49.395079339 +0000 UTC m=+771.383090821" observedRunningTime="2026-02-19 08:57:53.736020666 +0000 UTC m=+775.724032188" watchObservedRunningTime="2026-02-19 08:57:53.736852036 +0000 UTC m=+775.724863528" Feb 19 08:57:56 crc kubenswrapper[4788]: I0219 08:57:56.190788 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dbfn2"] Feb 19 08:57:56 crc kubenswrapper[4788]: I0219 08:57:56.192194 4788 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dbfn2" Feb 19 08:57:56 crc kubenswrapper[4788]: I0219 08:57:56.194892 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 19 08:57:56 crc kubenswrapper[4788]: I0219 08:57:56.195372 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 19 08:57:56 crc kubenswrapper[4788]: I0219 08:57:56.195784 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-vrbf7" Feb 19 08:57:56 crc kubenswrapper[4788]: I0219 08:57:56.211499 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dbfn2"] Feb 19 08:57:56 crc kubenswrapper[4788]: I0219 08:57:56.246193 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfl5v\" (UniqueName: \"kubernetes.io/projected/88b00924-b366-4bac-a420-fc82faa246ae-kube-api-access-pfl5v\") pod \"openstack-operator-index-dbfn2\" (UID: \"88b00924-b366-4bac-a420-fc82faa246ae\") " pod="openstack-operators/openstack-operator-index-dbfn2" Feb 19 08:57:56 crc kubenswrapper[4788]: I0219 08:57:56.347466 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfl5v\" (UniqueName: \"kubernetes.io/projected/88b00924-b366-4bac-a420-fc82faa246ae-kube-api-access-pfl5v\") pod \"openstack-operator-index-dbfn2\" (UID: \"88b00924-b366-4bac-a420-fc82faa246ae\") " pod="openstack-operators/openstack-operator-index-dbfn2" Feb 19 08:57:56 crc kubenswrapper[4788]: I0219 08:57:56.374967 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfl5v\" (UniqueName: \"kubernetes.io/projected/88b00924-b366-4bac-a420-fc82faa246ae-kube-api-access-pfl5v\") pod 
\"openstack-operator-index-dbfn2\" (UID: \"88b00924-b366-4bac-a420-fc82faa246ae\") " pod="openstack-operators/openstack-operator-index-dbfn2" Feb 19 08:57:56 crc kubenswrapper[4788]: I0219 08:57:56.526311 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dbfn2" Feb 19 08:57:56 crc kubenswrapper[4788]: I0219 08:57:56.992259 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dbfn2"] Feb 19 08:57:57 crc kubenswrapper[4788]: W0219 08:57:57.004495 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88b00924_b366_4bac_a420_fc82faa246ae.slice/crio-9df973490152b63cc9ebae46581d66ea39569d9d58244df5a5e8867739b0dd9f WatchSource:0}: Error finding container 9df973490152b63cc9ebae46581d66ea39569d9d58244df5a5e8867739b0dd9f: Status 404 returned error can't find the container with id 9df973490152b63cc9ebae46581d66ea39569d9d58244df5a5e8867739b0dd9f Feb 19 08:57:57 crc kubenswrapper[4788]: I0219 08:57:57.160041 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:57 crc kubenswrapper[4788]: I0219 08:57:57.204583 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:57:57 crc kubenswrapper[4788]: I0219 08:57:57.746342 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dbfn2" event={"ID":"88b00924-b366-4bac-a420-fc82faa246ae","Type":"ContainerStarted","Data":"9df973490152b63cc9ebae46581d66ea39569d9d58244df5a5e8867739b0dd9f"} Feb 19 08:57:59 crc kubenswrapper[4788]: I0219 08:57:59.574270 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dbfn2"] Feb 19 08:58:00 crc kubenswrapper[4788]: I0219 08:58:00.182343 4788 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/openstack-operator-index-wnrq7"] Feb 19 08:58:00 crc kubenswrapper[4788]: I0219 08:58:00.183877 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wnrq7" Feb 19 08:58:00 crc kubenswrapper[4788]: I0219 08:58:00.197887 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wnrq7"] Feb 19 08:58:00 crc kubenswrapper[4788]: I0219 08:58:00.309604 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpccf\" (UniqueName: \"kubernetes.io/projected/4b258a60-4663-4a78-9d1b-e82add2f9d42-kube-api-access-rpccf\") pod \"openstack-operator-index-wnrq7\" (UID: \"4b258a60-4663-4a78-9d1b-e82add2f9d42\") " pod="openstack-operators/openstack-operator-index-wnrq7" Feb 19 08:58:00 crc kubenswrapper[4788]: I0219 08:58:00.411765 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpccf\" (UniqueName: \"kubernetes.io/projected/4b258a60-4663-4a78-9d1b-e82add2f9d42-kube-api-access-rpccf\") pod \"openstack-operator-index-wnrq7\" (UID: \"4b258a60-4663-4a78-9d1b-e82add2f9d42\") " pod="openstack-operators/openstack-operator-index-wnrq7" Feb 19 08:58:00 crc kubenswrapper[4788]: I0219 08:58:00.441891 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpccf\" (UniqueName: \"kubernetes.io/projected/4b258a60-4663-4a78-9d1b-e82add2f9d42-kube-api-access-rpccf\") pod \"openstack-operator-index-wnrq7\" (UID: \"4b258a60-4663-4a78-9d1b-e82add2f9d42\") " pod="openstack-operators/openstack-operator-index-wnrq7" Feb 19 08:58:00 crc kubenswrapper[4788]: I0219 08:58:00.514826 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wnrq7" Feb 19 08:58:01 crc kubenswrapper[4788]: I0219 08:58:01.656640 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wnrq7"] Feb 19 08:58:01 crc kubenswrapper[4788]: W0219 08:58:01.801658 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b258a60_4663_4a78_9d1b_e82add2f9d42.slice/crio-f2859aece991f323295722958d03dcd00efc763dfefb53f3942d880993a95bb4 WatchSource:0}: Error finding container f2859aece991f323295722958d03dcd00efc763dfefb53f3942d880993a95bb4: Status 404 returned error can't find the container with id f2859aece991f323295722958d03dcd00efc763dfefb53f3942d880993a95bb4 Feb 19 08:58:02 crc kubenswrapper[4788]: I0219 08:58:02.156615 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-25kjn" Feb 19 08:58:02 crc kubenswrapper[4788]: I0219 08:58:02.163916 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-h9lb6" Feb 19 08:58:02 crc kubenswrapper[4788]: I0219 08:58:02.784761 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dbfn2" event={"ID":"88b00924-b366-4bac-a420-fc82faa246ae","Type":"ContainerStarted","Data":"79810c05d2b56cb24a7990f3c90145a578fdbe83cd059e8c5dcea1418098ebb9"} Feb 19 08:58:02 crc kubenswrapper[4788]: I0219 08:58:02.784898 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dbfn2" podUID="88b00924-b366-4bac-a420-fc82faa246ae" containerName="registry-server" containerID="cri-o://79810c05d2b56cb24a7990f3c90145a578fdbe83cd059e8c5dcea1418098ebb9" gracePeriod=2 Feb 19 08:58:02 crc kubenswrapper[4788]: I0219 08:58:02.787529 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-wnrq7" event={"ID":"4b258a60-4663-4a78-9d1b-e82add2f9d42","Type":"ContainerStarted","Data":"740ff0656f0d4559b409a7a439336757dedef7a236e490d7e1c2aa0ed80ec7b2"} Feb 19 08:58:02 crc kubenswrapper[4788]: I0219 08:58:02.787569 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wnrq7" event={"ID":"4b258a60-4663-4a78-9d1b-e82add2f9d42","Type":"ContainerStarted","Data":"f2859aece991f323295722958d03dcd00efc763dfefb53f3942d880993a95bb4"} Feb 19 08:58:02 crc kubenswrapper[4788]: I0219 08:58:02.815943 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dbfn2" podStartSLOduration=1.857982653 podStartE2EDuration="6.815922829s" podCreationTimestamp="2026-02-19 08:57:56 +0000 UTC" firstStartedPulling="2026-02-19 08:57:57.006764634 +0000 UTC m=+778.994776106" lastFinishedPulling="2026-02-19 08:58:01.9647048 +0000 UTC m=+783.952716282" observedRunningTime="2026-02-19 08:58:02.813651825 +0000 UTC m=+784.801663357" watchObservedRunningTime="2026-02-19 08:58:02.815922829 +0000 UTC m=+784.803934311" Feb 19 08:58:02 crc kubenswrapper[4788]: I0219 08:58:02.842005 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wnrq7" podStartSLOduration=2.684828474 podStartE2EDuration="2.841974504s" podCreationTimestamp="2026-02-19 08:58:00 +0000 UTC" firstStartedPulling="2026-02-19 08:58:01.805808268 +0000 UTC m=+783.793819780" lastFinishedPulling="2026-02-19 08:58:01.962954318 +0000 UTC m=+783.950965810" observedRunningTime="2026-02-19 08:58:02.832762936 +0000 UTC m=+784.820774418" watchObservedRunningTime="2026-02-19 08:58:02.841974504 +0000 UTC m=+784.829986006" Feb 19 08:58:03 crc kubenswrapper[4788]: I0219 08:58:03.797941 4788 generic.go:334] "Generic (PLEG): container finished" podID="88b00924-b366-4bac-a420-fc82faa246ae" 
containerID="79810c05d2b56cb24a7990f3c90145a578fdbe83cd059e8c5dcea1418098ebb9" exitCode=0 Feb 19 08:58:03 crc kubenswrapper[4788]: I0219 08:58:03.798042 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dbfn2" event={"ID":"88b00924-b366-4bac-a420-fc82faa246ae","Type":"ContainerDied","Data":"79810c05d2b56cb24a7990f3c90145a578fdbe83cd059e8c5dcea1418098ebb9"} Feb 19 08:58:03 crc kubenswrapper[4788]: I0219 08:58:03.875521 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dbfn2" Feb 19 08:58:03 crc kubenswrapper[4788]: I0219 08:58:03.967988 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfl5v\" (UniqueName: \"kubernetes.io/projected/88b00924-b366-4bac-a420-fc82faa246ae-kube-api-access-pfl5v\") pod \"88b00924-b366-4bac-a420-fc82faa246ae\" (UID: \"88b00924-b366-4bac-a420-fc82faa246ae\") " Feb 19 08:58:03 crc kubenswrapper[4788]: I0219 08:58:03.975438 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88b00924-b366-4bac-a420-fc82faa246ae-kube-api-access-pfl5v" (OuterVolumeSpecName: "kube-api-access-pfl5v") pod "88b00924-b366-4bac-a420-fc82faa246ae" (UID: "88b00924-b366-4bac-a420-fc82faa246ae"). InnerVolumeSpecName "kube-api-access-pfl5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 08:58:04 crc kubenswrapper[4788]: I0219 08:58:04.069691 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfl5v\" (UniqueName: \"kubernetes.io/projected/88b00924-b366-4bac-a420-fc82faa246ae-kube-api-access-pfl5v\") on node \"crc\" DevicePath \"\"" Feb 19 08:58:04 crc kubenswrapper[4788]: I0219 08:58:04.810024 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dbfn2" event={"ID":"88b00924-b366-4bac-a420-fc82faa246ae","Type":"ContainerDied","Data":"9df973490152b63cc9ebae46581d66ea39569d9d58244df5a5e8867739b0dd9f"} Feb 19 08:58:04 crc kubenswrapper[4788]: I0219 08:58:04.810061 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dbfn2" Feb 19 08:58:04 crc kubenswrapper[4788]: I0219 08:58:04.810134 4788 scope.go:117] "RemoveContainer" containerID="79810c05d2b56cb24a7990f3c90145a578fdbe83cd059e8c5dcea1418098ebb9" Feb 19 08:58:04 crc kubenswrapper[4788]: I0219 08:58:04.839013 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dbfn2"] Feb 19 08:58:04 crc kubenswrapper[4788]: I0219 08:58:04.844726 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dbfn2"] Feb 19 08:58:06 crc kubenswrapper[4788]: I0219 08:58:06.741968 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88b00924-b366-4bac-a420-fc82faa246ae" path="/var/lib/kubelet/pods/88b00924-b366-4bac-a420-fc82faa246ae/volumes" Feb 19 08:58:10 crc kubenswrapper[4788]: I0219 08:58:10.515419 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-wnrq7" Feb 19 08:58:10 crc kubenswrapper[4788]: I0219 08:58:10.515874 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-index-wnrq7" Feb 19 08:58:10 crc kubenswrapper[4788]: I0219 08:58:10.563928 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-wnrq7" Feb 19 08:58:10 crc kubenswrapper[4788]: I0219 08:58:10.905169 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-wnrq7" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.028479 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc"] Feb 19 08:58:12 crc kubenswrapper[4788]: E0219 08:58:12.029382 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88b00924-b366-4bac-a420-fc82faa246ae" containerName="registry-server" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.029411 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="88b00924-b366-4bac-a420-fc82faa246ae" containerName="registry-server" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.029732 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="88b00924-b366-4bac-a420-fc82faa246ae" containerName="registry-server" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.031841 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.036903 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-4zz2t" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.039402 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc"] Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.105869 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxtbs\" (UniqueName: \"kubernetes.io/projected/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-kube-api-access-xxtbs\") pod \"576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") " pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.105951 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-bundle\") pod \"576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") " pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.106199 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-util\") pod \"576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") " pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 
08:58:12.207923 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-util\") pod \"576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") " pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.208042 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxtbs\" (UniqueName: \"kubernetes.io/projected/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-kube-api-access-xxtbs\") pod \"576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") " pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.208090 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-bundle\") pod \"576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") " pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.208461 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-util\") pod \"576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") " pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.208840 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-bundle\") pod \"576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") " pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.232280 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxtbs\" (UniqueName: \"kubernetes.io/projected/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-kube-api-access-xxtbs\") pod \"576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") " pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.380877 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.858181 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc"] Feb 19 08:58:12 crc kubenswrapper[4788]: W0219 08:58:12.862694 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c4ff725_92d6_4b9a_88d1_bf3366ba1111.slice/crio-01002352dbae4f2739056595efc0fdb5042d541a403621926f837a6f3a5bd4e5 WatchSource:0}: Error finding container 01002352dbae4f2739056595efc0fdb5042d541a403621926f837a6f3a5bd4e5: Status 404 returned error can't find the container with id 01002352dbae4f2739056595efc0fdb5042d541a403621926f837a6f3a5bd4e5 Feb 19 08:58:12 crc kubenswrapper[4788]: I0219 08:58:12.881397 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" 
event={"ID":"5c4ff725-92d6-4b9a-88d1-bf3366ba1111","Type":"ContainerStarted","Data":"01002352dbae4f2739056595efc0fdb5042d541a403621926f837a6f3a5bd4e5"} Feb 19 08:58:13 crc kubenswrapper[4788]: I0219 08:58:13.891909 4788 generic.go:334] "Generic (PLEG): container finished" podID="5c4ff725-92d6-4b9a-88d1-bf3366ba1111" containerID="72c6737b0f89ad73c115f3a3cbf9fe8f12e46094bc963fc02b47e091fd49ca28" exitCode=0 Feb 19 08:58:13 crc kubenswrapper[4788]: I0219 08:58:13.892001 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" event={"ID":"5c4ff725-92d6-4b9a-88d1-bf3366ba1111","Type":"ContainerDied","Data":"72c6737b0f89ad73c115f3a3cbf9fe8f12e46094bc963fc02b47e091fd49ca28"} Feb 19 08:58:14 crc kubenswrapper[4788]: I0219 08:58:14.903477 4788 generic.go:334] "Generic (PLEG): container finished" podID="5c4ff725-92d6-4b9a-88d1-bf3366ba1111" containerID="ba66b7ccb6371492209bb42250fe27217dcfbd1ec2c834bf3337bb2e475588f3" exitCode=0 Feb 19 08:58:14 crc kubenswrapper[4788]: I0219 08:58:14.903593 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" event={"ID":"5c4ff725-92d6-4b9a-88d1-bf3366ba1111","Type":"ContainerDied","Data":"ba66b7ccb6371492209bb42250fe27217dcfbd1ec2c834bf3337bb2e475588f3"} Feb 19 08:58:15 crc kubenswrapper[4788]: I0219 08:58:15.913496 4788 generic.go:334] "Generic (PLEG): container finished" podID="5c4ff725-92d6-4b9a-88d1-bf3366ba1111" containerID="5b101a9be1ff881b205efb0e8203fd19233128b7b66c9c8eda4895b20bd77c11" exitCode=0 Feb 19 08:58:15 crc kubenswrapper[4788]: I0219 08:58:15.913570 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" event={"ID":"5c4ff725-92d6-4b9a-88d1-bf3366ba1111","Type":"ContainerDied","Data":"5b101a9be1ff881b205efb0e8203fd19233128b7b66c9c8eda4895b20bd77c11"} Feb 19 
08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.178371 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc"
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.283203 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxtbs\" (UniqueName: \"kubernetes.io/projected/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-kube-api-access-xxtbs\") pod \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") "
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.283446 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-bundle\") pod \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") "
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.283504 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-util\") pod \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\" (UID: \"5c4ff725-92d6-4b9a-88d1-bf3366ba1111\") "
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.284498 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-bundle" (OuterVolumeSpecName: "bundle") pod "5c4ff725-92d6-4b9a-88d1-bf3366ba1111" (UID: "5c4ff725-92d6-4b9a-88d1-bf3366ba1111"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.290562 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-kube-api-access-xxtbs" (OuterVolumeSpecName: "kube-api-access-xxtbs") pod "5c4ff725-92d6-4b9a-88d1-bf3366ba1111" (UID: "5c4ff725-92d6-4b9a-88d1-bf3366ba1111"). InnerVolumeSpecName "kube-api-access-xxtbs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.312616 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-util" (OuterVolumeSpecName: "util") pod "5c4ff725-92d6-4b9a-88d1-bf3366ba1111" (UID: "5c4ff725-92d6-4b9a-88d1-bf3366ba1111"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.385484 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxtbs\" (UniqueName: \"kubernetes.io/projected/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-kube-api-access-xxtbs\") on node \"crc\" DevicePath \"\""
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.385547 4788 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.385571 4788 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4ff725-92d6-4b9a-88d1-bf3366ba1111-util\") on node \"crc\" DevicePath \"\""
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.933662 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc" event={"ID":"5c4ff725-92d6-4b9a-88d1-bf3366ba1111","Type":"ContainerDied","Data":"01002352dbae4f2739056595efc0fdb5042d541a403621926f837a6f3a5bd4e5"}
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.934189 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01002352dbae4f2739056595efc0fdb5042d541a403621926f837a6f3a5bd4e5"
Feb 19 08:58:17 crc kubenswrapper[4788]: I0219 08:58:17.933714 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.117092 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp"]
Feb 19 08:58:24 crc kubenswrapper[4788]: E0219 08:58:24.117655 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4ff725-92d6-4b9a-88d1-bf3366ba1111" containerName="util"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.117671 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4ff725-92d6-4b9a-88d1-bf3366ba1111" containerName="util"
Feb 19 08:58:24 crc kubenswrapper[4788]: E0219 08:58:24.117703 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4ff725-92d6-4b9a-88d1-bf3366ba1111" containerName="extract"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.117711 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4ff725-92d6-4b9a-88d1-bf3366ba1111" containerName="extract"
Feb 19 08:58:24 crc kubenswrapper[4788]: E0219 08:58:24.117720 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4ff725-92d6-4b9a-88d1-bf3366ba1111" containerName="pull"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.117728 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4ff725-92d6-4b9a-88d1-bf3366ba1111" containerName="pull"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.117857 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4ff725-92d6-4b9a-88d1-bf3366ba1111" containerName="extract"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.118230 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.123533 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-49rsb"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.145928 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp"]
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.285177 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcpkz\" (UniqueName: \"kubernetes.io/projected/5b19ebbb-b6af-4e2a-8af4-718efe39cd68-kube-api-access-xcpkz\") pod \"openstack-operator-controller-init-5c66fdff94-lm8dp\" (UID: \"5b19ebbb-b6af-4e2a-8af4-718efe39cd68\") " pod="openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.386456 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcpkz\" (UniqueName: \"kubernetes.io/projected/5b19ebbb-b6af-4e2a-8af4-718efe39cd68-kube-api-access-xcpkz\") pod \"openstack-operator-controller-init-5c66fdff94-lm8dp\" (UID: \"5b19ebbb-b6af-4e2a-8af4-718efe39cd68\") " pod="openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.411321 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcpkz\" (UniqueName: \"kubernetes.io/projected/5b19ebbb-b6af-4e2a-8af4-718efe39cd68-kube-api-access-xcpkz\") pod \"openstack-operator-controller-init-5c66fdff94-lm8dp\" (UID: \"5b19ebbb-b6af-4e2a-8af4-718efe39cd68\") " pod="openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.438964 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp"
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.640984 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp"]
Feb 19 08:58:24 crc kubenswrapper[4788]: I0219 08:58:24.990732 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp" event={"ID":"5b19ebbb-b6af-4e2a-8af4-718efe39cd68","Type":"ContainerStarted","Data":"37195e0152a98d374f3c70a3f08065ef21df4a55af203020d53b09aee25d37ad"}
Feb 19 08:58:29 crc kubenswrapper[4788]: I0219 08:58:29.018594 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp" event={"ID":"5b19ebbb-b6af-4e2a-8af4-718efe39cd68","Type":"ContainerStarted","Data":"a83538135d82179254b70fa9d5a026ede7ab725afbcacfc9f4ce10f57063ac69"}
Feb 19 08:58:29 crc kubenswrapper[4788]: I0219 08:58:29.019212 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp"
Feb 19 08:58:29 crc kubenswrapper[4788]: I0219 08:58:29.069135 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp" podStartSLOduration=1.361805666 podStartE2EDuration="5.069117331s" podCreationTimestamp="2026-02-19 08:58:24 +0000 UTC" firstStartedPulling="2026-02-19 08:58:24.663334844 +0000 UTC m=+806.651346316" lastFinishedPulling="2026-02-19 08:58:28.370646489 +0000 UTC m=+810.358657981" observedRunningTime="2026-02-19 08:58:29.065500816 +0000 UTC m=+811.053512318" watchObservedRunningTime="2026-02-19 08:58:29.069117331 +0000 UTC m=+811.057128803"
Feb 19 08:58:34 crc kubenswrapper[4788]: I0219 08:58:34.443122 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5c66fdff94-lm8dp"
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.872165 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z"]
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.873739 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z"
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.875624 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jfl2g"
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.878710 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk"]
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.881327 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk"
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.888590 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk"]
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.897824 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-8rw2f"
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.906631 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z"]
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.912338 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh"]
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.913164 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh"
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.918220 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh"]
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.921431 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sthq4\" (UniqueName: \"kubernetes.io/projected/c4a6e8c2-5708-45ec-8cd7-08d552abbe53-kube-api-access-sthq4\") pod \"barbican-operator-controller-manager-868647ff47-8jh9z\" (UID: \"c4a6e8c2-5708-45ec-8cd7-08d552abbe53\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z"
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.921546 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvdxn\" (UniqueName: \"kubernetes.io/projected/3e3f67ff-5285-401a-a19e-2476a8334248-kube-api-access-lvdxn\") pod \"cinder-operator-controller-manager-5d946d989d-tn2mk\" (UID: \"3e3f67ff-5285-401a-a19e-2476a8334248\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk"
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.974175 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-p9sk5"
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.991991 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-74624"]
Feb 19 08:58:53 crc kubenswrapper[4788]: I0219 08:58:53.992786 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-74624"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.004035 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qc5bm"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.005676 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-74624"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.024958 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sthq4\" (UniqueName: \"kubernetes.io/projected/c4a6e8c2-5708-45ec-8cd7-08d552abbe53-kube-api-access-sthq4\") pod \"barbican-operator-controller-manager-868647ff47-8jh9z\" (UID: \"c4a6e8c2-5708-45ec-8cd7-08d552abbe53\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.025033 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdkkf\" (UniqueName: \"kubernetes.io/projected/0da97318-27f4-465d-91c6-c44004a9e291-kube-api-access-wdkkf\") pod \"designate-operator-controller-manager-6d8bf5c495-bp9gh\" (UID: \"0da97318-27f4-465d-91c6-c44004a9e291\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.025092 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqvlb\" (UniqueName: \"kubernetes.io/projected/17fe7557-8cf3-4f24-86a3-993037455f15-kube-api-access-nqvlb\") pod \"glance-operator-controller-manager-77987464f4-74624\" (UID: \"17fe7557-8cf3-4f24-86a3-993037455f15\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-74624"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.025163 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvdxn\" (UniqueName: \"kubernetes.io/projected/3e3f67ff-5285-401a-a19e-2476a8334248-kube-api-access-lvdxn\") pod \"cinder-operator-controller-manager-5d946d989d-tn2mk\" (UID: \"3e3f67ff-5285-401a-a19e-2476a8334248\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.030768 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.031533 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.041715 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-8mk9c"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.057531 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.059968 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvdxn\" (UniqueName: \"kubernetes.io/projected/3e3f67ff-5285-401a-a19e-2476a8334248-kube-api-access-lvdxn\") pod \"cinder-operator-controller-manager-5d946d989d-tn2mk\" (UID: \"3e3f67ff-5285-401a-a19e-2476a8334248\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.072897 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sthq4\" (UniqueName: \"kubernetes.io/projected/c4a6e8c2-5708-45ec-8cd7-08d552abbe53-kube-api-access-sthq4\") pod \"barbican-operator-controller-manager-868647ff47-8jh9z\" (UID: \"c4a6e8c2-5708-45ec-8cd7-08d552abbe53\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.074972 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.075471 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.075948 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.075989 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.075945 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.079564 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.079812 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-q2jtx"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.080111 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-bh6gj"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.081378 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2b7qf"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.094717 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.095546 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.098017 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-npqsc"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.101401 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.108989 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.116624 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.126115 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert\") pod \"infra-operator-controller-manager-79d975b745-ccpr8\" (UID: \"bc6d475e-ace7-47ba-a9ba-cb493c7225c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.126168 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl4t9\" (UniqueName: \"kubernetes.io/projected/8156be62-dae2-4105-9c97-7cdd398e1eb4-kube-api-access-wl4t9\") pod \"horizon-operator-controller-manager-5b9b8895d5-rq69q\" (UID: \"8156be62-dae2-4105-9c97-7cdd398e1eb4\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.126195 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdkkf\" (UniqueName: \"kubernetes.io/projected/0da97318-27f4-465d-91c6-c44004a9e291-kube-api-access-wdkkf\") pod \"designate-operator-controller-manager-6d8bf5c495-bp9gh\" (UID: \"0da97318-27f4-465d-91c6-c44004a9e291\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.126215 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rphhk\" (UniqueName: \"kubernetes.io/projected/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-kube-api-access-rphhk\") pod \"infra-operator-controller-manager-79d975b745-ccpr8\" (UID: \"bc6d475e-ace7-47ba-a9ba-cb493c7225c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.126234 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnl94\" (UniqueName: \"kubernetes.io/projected/cb9194bc-2a08-4c61-9302-14d3ab1b731a-kube-api-access-mnl94\") pod \"keystone-operator-controller-manager-b4d948c87-6zwrs\" (UID: \"cb9194bc-2a08-4c61-9302-14d3ab1b731a\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.126284 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9rr\" (UniqueName: \"kubernetes.io/projected/e87d14f5-ac68-489e-a79c-d9962b5786e9-kube-api-access-ms9rr\") pod \"ironic-operator-controller-manager-554564d7fc-754h5\" (UID: \"e87d14f5-ac68-489e-a79c-d9962b5786e9\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.126302 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqvlb\" (UniqueName: \"kubernetes.io/projected/17fe7557-8cf3-4f24-86a3-993037455f15-kube-api-access-nqvlb\") pod \"glance-operator-controller-manager-77987464f4-74624\" (UID: \"17fe7557-8cf3-4f24-86a3-993037455f15\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-74624"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.133712 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2scs\" (UniqueName: \"kubernetes.io/projected/acc6ac6b-da33-4eb0-a2b9-33b6e45a118e-kube-api-access-z2scs\") pod \"heat-operator-controller-manager-57cc58f5d8-gxbgk\" (UID: \"acc6ac6b-da33-4eb0-a2b9-33b6e45a118e\") " pod="openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.134874 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.136269 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.139782 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-2dbfn"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.162790 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdkkf\" (UniqueName: \"kubernetes.io/projected/0da97318-27f4-465d-91c6-c44004a9e291-kube-api-access-wdkkf\") pod \"designate-operator-controller-manager-6d8bf5c495-bp9gh\" (UID: \"0da97318-27f4-465d-91c6-c44004a9e291\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.163758 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqvlb\" (UniqueName: \"kubernetes.io/projected/17fe7557-8cf3-4f24-86a3-993037455f15-kube-api-access-nqvlb\") pod \"glance-operator-controller-manager-77987464f4-74624\" (UID: \"17fe7557-8cf3-4f24-86a3-993037455f15\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-74624"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.171586 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.178427 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.179257 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.180603 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rkg7j"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.183558 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.184315 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.186789 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-9bfmj"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.196308 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.198714 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.199556 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.201562 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-k8s5l"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.205114 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.205800 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.207417 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-zw7jg"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.214687 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.230315 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.235343 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2scs\" (UniqueName: \"kubernetes.io/projected/acc6ac6b-da33-4eb0-a2b9-33b6e45a118e-kube-api-access-z2scs\") pod \"heat-operator-controller-manager-57cc58f5d8-gxbgk\" (UID: \"acc6ac6b-da33-4eb0-a2b9-33b6e45a118e\") " pod="openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.235419 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert\") pod \"infra-operator-controller-manager-79d975b745-ccpr8\" (UID: \"bc6d475e-ace7-47ba-a9ba-cb493c7225c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.235450 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n55hh\" (UniqueName: \"kubernetes.io/projected/c01b7431-8487-45ac-9b7b-c1ec5dc115f0-kube-api-access-n55hh\") pod \"octavia-operator-controller-manager-69f8888797-ddzgn\" (UID: \"c01b7431-8487-45ac-9b7b-c1ec5dc115f0\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.235485 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl4t9\" (UniqueName: \"kubernetes.io/projected/8156be62-dae2-4105-9c97-7cdd398e1eb4-kube-api-access-wl4t9\") pod \"horizon-operator-controller-manager-5b9b8895d5-rq69q\" (UID: \"8156be62-dae2-4105-9c97-7cdd398e1eb4\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.235512 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88vn\" (UniqueName: \"kubernetes.io/projected/04be53c8-52b4-43fd-9cab-f1484fd17140-kube-api-access-n88vn\") pod \"neutron-operator-controller-manager-64ddbf8bb-4j2v9\" (UID: \"04be53c8-52b4-43fd-9cab-f1484fd17140\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.235537 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw9x4\" (UniqueName: \"kubernetes.io/projected/226ea1a2-f858-46df-8a62-12f5c41da0c5-kube-api-access-cw9x4\") pod \"nova-operator-controller-manager-567668f5cf-9sr69\" (UID: \"226ea1a2-f858-46df-8a62-12f5c41da0c5\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.235657 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rphhk\" (UniqueName: \"kubernetes.io/projected/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-kube-api-access-rphhk\") pod \"infra-operator-controller-manager-79d975b745-ccpr8\" (UID: \"bc6d475e-ace7-47ba-a9ba-cb493c7225c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.235690 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnl94\" (UniqueName: \"kubernetes.io/projected/cb9194bc-2a08-4c61-9302-14d3ab1b731a-kube-api-access-mnl94\") pod \"keystone-operator-controller-manager-b4d948c87-6zwrs\" (UID: \"cb9194bc-2a08-4c61-9302-14d3ab1b731a\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.235724 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9h9l\" (UniqueName: \"kubernetes.io/projected/6a57c46b-96e1-4c3a-aede-8b9ced264828-kube-api-access-h9h9l\") pod \"mariadb-operator-controller-manager-6994f66f48-r2rc5\" (UID: \"6a57c46b-96e1-4c3a-aede-8b9ced264828\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.235755 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9rr\" (UniqueName: \"kubernetes.io/projected/e87d14f5-ac68-489e-a79c-d9962b5786e9-kube-api-access-ms9rr\") pod \"ironic-operator-controller-manager-554564d7fc-754h5\" (UID: \"e87d14f5-ac68-489e-a79c-d9962b5786e9\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.235794 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7gk4\" (UniqueName: \"kubernetes.io/projected/53d52f5e-a729-4d03-b949-7ccb6719754c-kube-api-access-z7gk4\") pod \"manila-operator-controller-manager-54f6768c69-q6x4r\" (UID: \"53d52f5e-a729-4d03-b949-7ccb6719754c\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r"
Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.236184 4788 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.236258 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert podName:bc6d475e-ace7-47ba-a9ba-cb493c7225c9 nodeName:}" failed. No retries permitted until 2026-02-19 08:58:54.736220762 +0000 UTC m=+836.724232244 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert") pod "infra-operator-controller-manager-79d975b745-ccpr8" (UID: "bc6d475e-ace7-47ba-a9ba-cb493c7225c9") : secret "infra-operator-webhook-server-cert" not found
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.238064 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.248323 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.254534 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.255107 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2scs\" (UniqueName: \"kubernetes.io/projected/acc6ac6b-da33-4eb0-a2b9-33b6e45a118e-kube-api-access-z2scs\") pod \"heat-operator-controller-manager-57cc58f5d8-gxbgk\" (UID: \"acc6ac6b-da33-4eb0-a2b9-33b6e45a118e\") " pod="openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.255886 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rphhk\" (UniqueName: \"kubernetes.io/projected/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-kube-api-access-rphhk\") pod \"infra-operator-controller-manager-79d975b745-ccpr8\" (UID: \"bc6d475e-ace7-47ba-a9ba-cb493c7225c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.260567 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9rr\" (UniqueName: \"kubernetes.io/projected/e87d14f5-ac68-489e-a79c-d9962b5786e9-kube-api-access-ms9rr\") pod \"ironic-operator-controller-manager-554564d7fc-754h5\" (UID: \"e87d14f5-ac68-489e-a79c-d9962b5786e9\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.263018 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.264366 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnl94\" (UniqueName: \"kubernetes.io/projected/cb9194bc-2a08-4c61-9302-14d3ab1b731a-kube-api-access-mnl94\") pod \"keystone-operator-controller-manager-b4d948c87-6zwrs\" (UID: \"cb9194bc-2a08-4c61-9302-14d3ab1b731a\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.263828 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.265157 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.266622 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-272mp"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.270689 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.271550 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.273879 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.275434 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl4t9\" (UniqueName: \"kubernetes.io/projected/8156be62-dae2-4105-9c97-7cdd398e1eb4-kube-api-access-wl4t9\") pod \"horizon-operator-controller-manager-5b9b8895d5-rq69q\" (UID: \"8156be62-dae2-4105-9c97-7cdd398e1eb4\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.275736 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2xldr"
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.277278 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75"]
Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.278056 4788 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.284917 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-96xrv" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.298357 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.301408 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.313488 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.318654 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.326724 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-74624" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.327354 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.337192 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j59hv\" (UniqueName: \"kubernetes.io/projected/87f1ff04-454b-4c9d-82e6-5e7239c63978-kube-api-access-j59hv\") pod \"placement-operator-controller-manager-8497b45c89-fjt75\" (UID: \"87f1ff04-454b-4c9d-82e6-5e7239c63978\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.337230 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg\" (UID: \"cad2b3a6-6577-4136-bf9e-213884d94b31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.337265 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n55hh\" (UniqueName: \"kubernetes.io/projected/c01b7431-8487-45ac-9b7b-c1ec5dc115f0-kube-api-access-n55hh\") pod \"octavia-operator-controller-manager-69f8888797-ddzgn\" (UID: \"c01b7431-8487-45ac-9b7b-c1ec5dc115f0\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.337300 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n88vn\" (UniqueName: \"kubernetes.io/projected/04be53c8-52b4-43fd-9cab-f1484fd17140-kube-api-access-n88vn\") pod 
\"neutron-operator-controller-manager-64ddbf8bb-4j2v9\" (UID: \"04be53c8-52b4-43fd-9cab-f1484fd17140\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.337318 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw9x4\" (UniqueName: \"kubernetes.io/projected/226ea1a2-f858-46df-8a62-12f5c41da0c5-kube-api-access-cw9x4\") pod \"nova-operator-controller-manager-567668f5cf-9sr69\" (UID: \"226ea1a2-f858-46df-8a62-12f5c41da0c5\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.337349 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9h9l\" (UniqueName: \"kubernetes.io/projected/6a57c46b-96e1-4c3a-aede-8b9ced264828-kube-api-access-h9h9l\") pod \"mariadb-operator-controller-manager-6994f66f48-r2rc5\" (UID: \"6a57c46b-96e1-4c3a-aede-8b9ced264828\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.337371 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slrqd\" (UniqueName: \"kubernetes.io/projected/91c888b5-7f35-4049-830e-855914654f90-kube-api-access-slrqd\") pod \"ovn-operator-controller-manager-d44cf6b75-dlfjs\" (UID: \"91c888b5-7f35-4049-830e-855914654f90\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.337399 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7gk4\" (UniqueName: \"kubernetes.io/projected/53d52f5e-a729-4d03-b949-7ccb6719754c-kube-api-access-z7gk4\") pod \"manila-operator-controller-manager-54f6768c69-q6x4r\" (UID: \"53d52f5e-a729-4d03-b949-7ccb6719754c\") " 
pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.337424 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vfcz\" (UniqueName: \"kubernetes.io/projected/cad2b3a6-6577-4136-bf9e-213884d94b31-kube-api-access-7vfcz\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg\" (UID: \"cad2b3a6-6577-4136-bf9e-213884d94b31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.354475 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9h9l\" (UniqueName: \"kubernetes.io/projected/6a57c46b-96e1-4c3a-aede-8b9ced264828-kube-api-access-h9h9l\") pod \"mariadb-operator-controller-manager-6994f66f48-r2rc5\" (UID: \"6a57c46b-96e1-4c3a-aede-8b9ced264828\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.358040 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.359766 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7gk4\" (UniqueName: \"kubernetes.io/projected/53d52f5e-a729-4d03-b949-7ccb6719754c-kube-api-access-z7gk4\") pod \"manila-operator-controller-manager-54f6768c69-q6x4r\" (UID: \"53d52f5e-a729-4d03-b949-7ccb6719754c\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.360435 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88vn\" (UniqueName: \"kubernetes.io/projected/04be53c8-52b4-43fd-9cab-f1484fd17140-kube-api-access-n88vn\") pod \"neutron-operator-controller-manager-64ddbf8bb-4j2v9\" (UID: \"04be53c8-52b4-43fd-9cab-f1484fd17140\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.363750 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw9x4\" (UniqueName: \"kubernetes.io/projected/226ea1a2-f858-46df-8a62-12f5c41da0c5-kube-api-access-cw9x4\") pod \"nova-operator-controller-manager-567668f5cf-9sr69\" (UID: \"226ea1a2-f858-46df-8a62-12f5c41da0c5\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.364567 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n55hh\" (UniqueName: \"kubernetes.io/projected/c01b7431-8487-45ac-9b7b-c1ec5dc115f0-kube-api-access-n55hh\") pod \"octavia-operator-controller-manager-69f8888797-ddzgn\" (UID: \"c01b7431-8487-45ac-9b7b-c1ec5dc115f0\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.381700 4788 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.382758 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.386098 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rk4l2" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.401929 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.428349 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.439992 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slrqd\" (UniqueName: \"kubernetes.io/projected/91c888b5-7f35-4049-830e-855914654f90-kube-api-access-slrqd\") pod \"ovn-operator-controller-manager-d44cf6b75-dlfjs\" (UID: \"91c888b5-7f35-4049-830e-855914654f90\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.440057 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vfcz\" (UniqueName: \"kubernetes.io/projected/cad2b3a6-6577-4136-bf9e-213884d94b31-kube-api-access-7vfcz\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg\" (UID: \"cad2b3a6-6577-4136-bf9e-213884d94b31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.440083 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffn4p\" (UniqueName: \"kubernetes.io/projected/777de642-ca99-4f43-b282-5c9703e97dfe-kube-api-access-ffn4p\") pod \"swift-operator-controller-manager-68f46476f-ncsrp\" (UID: \"777de642-ca99-4f43-b282-5c9703e97dfe\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.440132 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j59hv\" (UniqueName: \"kubernetes.io/projected/87f1ff04-454b-4c9d-82e6-5e7239c63978-kube-api-access-j59hv\") pod \"placement-operator-controller-manager-8497b45c89-fjt75\" (UID: \"87f1ff04-454b-4c9d-82e6-5e7239c63978\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.440155 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg\" (UID: \"cad2b3a6-6577-4136-bf9e-213884d94b31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.440622 4788 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.441554 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert podName:cad2b3a6-6577-4136-bf9e-213884d94b31 nodeName:}" failed. No retries permitted until 2026-02-19 08:58:54.9415345 +0000 UTC m=+836.929545972 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" (UID: "cad2b3a6-6577-4136-bf9e-213884d94b31") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.474061 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.474987 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.479554 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vspkq" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.480797 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j59hv\" (UniqueName: \"kubernetes.io/projected/87f1ff04-454b-4c9d-82e6-5e7239c63978-kube-api-access-j59hv\") pod \"placement-operator-controller-manager-8497b45c89-fjt75\" (UID: \"87f1ff04-454b-4c9d-82e6-5e7239c63978\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.480861 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.482951 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vfcz\" (UniqueName: \"kubernetes.io/projected/cad2b3a6-6577-4136-bf9e-213884d94b31-kube-api-access-7vfcz\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg\" (UID: \"cad2b3a6-6577-4136-bf9e-213884d94b31\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.491195 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slrqd\" (UniqueName: \"kubernetes.io/projected/91c888b5-7f35-4049-830e-855914654f90-kube-api-access-slrqd\") pod \"ovn-operator-controller-manager-d44cf6b75-dlfjs\" (UID: \"91c888b5-7f35-4049-830e-855914654f90\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.502006 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.514412 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.523285 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.530275 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-g7mw7"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.532193 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.539300 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.539546 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-7fgt2" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.541092 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffn4p\" (UniqueName: \"kubernetes.io/projected/777de642-ca99-4f43-b282-5c9703e97dfe-kube-api-access-ffn4p\") pod \"swift-operator-controller-manager-68f46476f-ncsrp\" (UID: \"777de642-ca99-4f43-b282-5c9703e97dfe\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.541300 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrb2g\" (UniqueName: \"kubernetes.io/projected/52b85fb1-54ad-440a-9c9d-d9969f34f1c7-kube-api-access-nrb2g\") pod \"telemetry-operator-controller-manager-7f45b4ff68-pbtll\" (UID: \"52b85fb1-54ad-440a-9c9d-d9969f34f1c7\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.543211 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-g7mw7"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.552826 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.573355 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.574801 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffn4p\" (UniqueName: \"kubernetes.io/projected/777de642-ca99-4f43-b282-5c9703e97dfe-kube-api-access-ffn4p\") pod \"swift-operator-controller-manager-68f46476f-ncsrp\" (UID: \"777de642-ca99-4f43-b282-5c9703e97dfe\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.576609 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.580156 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-n7s2p" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.586813 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.587230 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.598665 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.610531 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.629903 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.631727 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.632454 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.640653 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.640802 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.641357 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7hnp6" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.641980 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrb2g\" (UniqueName: \"kubernetes.io/projected/52b85fb1-54ad-440a-9c9d-d9969f34f1c7-kube-api-access-nrb2g\") pod \"telemetry-operator-controller-manager-7f45b4ff68-pbtll\" (UID: \"52b85fb1-54ad-440a-9c9d-d9969f34f1c7\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.642099 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-774d4\" (UniqueName: 
\"kubernetes.io/projected/7d01364a-9507-4d68-bb0b-efbb67fe2e48-kube-api-access-774d4\") pod \"watcher-operator-controller-manager-5db88f68c-6dvcl\" (UID: \"7d01364a-9507-4d68-bb0b-efbb67fe2e48\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.642205 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss65h\" (UniqueName: \"kubernetes.io/projected/aa0d4ba2-0512-42e8-8c66-137bf969f706-kube-api-access-ss65h\") pod \"test-operator-controller-manager-7866795846-g7mw7\" (UID: \"aa0d4ba2-0512-42e8-8c66-137bf969f706\") " pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.652043 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.665167 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrb2g\" (UniqueName: \"kubernetes.io/projected/52b85fb1-54ad-440a-9c9d-d9969f34f1c7-kube-api-access-nrb2g\") pod \"telemetry-operator-controller-manager-7f45b4ff68-pbtll\" (UID: \"52b85fb1-54ad-440a-9c9d-d9969f34f1c7\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.684319 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gb54d"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.685148 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gb54d"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.685192 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gb54d" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.688796 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-vhvkr" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.711695 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.744117 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-774d4\" (UniqueName: \"kubernetes.io/projected/7d01364a-9507-4d68-bb0b-efbb67fe2e48-kube-api-access-774d4\") pod \"watcher-operator-controller-manager-5db88f68c-6dvcl\" (UID: \"7d01364a-9507-4d68-bb0b-efbb67fe2e48\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.744188 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.744222 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert\") pod \"infra-operator-controller-manager-79d975b745-ccpr8\" (UID: \"bc6d475e-ace7-47ba-a9ba-cb493c7225c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.744286 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ss65h\" (UniqueName: \"kubernetes.io/projected/aa0d4ba2-0512-42e8-8c66-137bf969f706-kube-api-access-ss65h\") pod \"test-operator-controller-manager-7866795846-g7mw7\" (UID: \"aa0d4ba2-0512-42e8-8c66-137bf969f706\") " pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.744311 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w7tb\" (UniqueName: \"kubernetes.io/projected/d337efba-1c27-47ac-bdd7-17c3848678cb-kube-api-access-8w7tb\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.744364 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.744384 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd72k\" (UniqueName: \"kubernetes.io/projected/cfd2e1e0-4b24-4e57-8a3d-779a03729f0e-kube-api-access-dd72k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gb54d\" (UID: \"cfd2e1e0-4b24-4e57-8a3d-779a03729f0e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gb54d" Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.744550 4788 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found 
Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.744617 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert podName:bc6d475e-ace7-47ba-a9ba-cb493c7225c9 nodeName:}" failed. No retries permitted until 2026-02-19 08:58:55.744600886 +0000 UTC m=+837.732612358 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert") pod "infra-operator-controller-manager-79d975b745-ccpr8" (UID: "bc6d475e-ace7-47ba-a9ba-cb493c7225c9") : secret "infra-operator-webhook-server-cert" not found Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.754642 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.763198 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss65h\" (UniqueName: \"kubernetes.io/projected/aa0d4ba2-0512-42e8-8c66-137bf969f706-kube-api-access-ss65h\") pod \"test-operator-controller-manager-7866795846-g7mw7\" (UID: \"aa0d4ba2-0512-42e8-8c66-137bf969f706\") " pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.776924 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-774d4\" (UniqueName: \"kubernetes.io/projected/7d01364a-9507-4d68-bb0b-efbb67fe2e48-kube-api-access-774d4\") pod \"watcher-operator-controller-manager-5db88f68c-6dvcl\" (UID: \"7d01364a-9507-4d68-bb0b-efbb67fe2e48\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.808322 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.846817 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.846871 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w7tb\" (UniqueName: \"kubernetes.io/projected/d337efba-1c27-47ac-bdd7-17c3848678cb-kube-api-access-8w7tb\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.846905 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd72k\" (UniqueName: \"kubernetes.io/projected/cfd2e1e0-4b24-4e57-8a3d-779a03729f0e-kube-api-access-dd72k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gb54d\" (UID: \"cfd2e1e0-4b24-4e57-8a3d-779a03729f0e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gb54d" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.846920 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.847081 4788 
secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.847125 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs podName:d337efba-1c27-47ac-bdd7-17c3848678cb nodeName:}" failed. No retries permitted until 2026-02-19 08:58:55.347112496 +0000 UTC m=+837.335123968 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs") pod "openstack-operator-controller-manager-6774fbc4bc-5gddb" (UID: "d337efba-1c27-47ac-bdd7-17c3848678cb") : secret "metrics-server-cert" not found Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.847347 4788 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.847369 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs podName:d337efba-1c27-47ac-bdd7-17c3848678cb nodeName:}" failed. No retries permitted until 2026-02-19 08:58:55.347362392 +0000 UTC m=+837.335373864 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs") pod "openstack-operator-controller-manager-6774fbc4bc-5gddb" (UID: "d337efba-1c27-47ac-bdd7-17c3848678cb") : secret "webhook-server-cert" not found Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.885199 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w7tb\" (UniqueName: \"kubernetes.io/projected/d337efba-1c27-47ac-bdd7-17c3848678cb-kube-api-access-8w7tb\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.885483 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.889081 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd72k\" (UniqueName: \"kubernetes.io/projected/cfd2e1e0-4b24-4e57-8a3d-779a03729f0e-kube-api-access-dd72k\") pod \"rabbitmq-cluster-operator-manager-668c99d594-gb54d\" (UID: \"cfd2e1e0-4b24-4e57-8a3d-779a03729f0e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gb54d" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.924549 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl" Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.944685 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z"] Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.948892 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg\" (UID: \"cad2b3a6-6577-4136-bf9e-213884d94b31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.949083 4788 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 08:58:54 crc kubenswrapper[4788]: E0219 08:58:54.949132 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert podName:cad2b3a6-6577-4136-bf9e-213884d94b31 nodeName:}" failed. No retries permitted until 2026-02-19 08:58:55.949118395 +0000 UTC m=+837.937129867 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" (UID: "cad2b3a6-6577-4136-bf9e-213884d94b31") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 08:58:54 crc kubenswrapper[4788]: I0219 08:58:54.989437 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-74624"] Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.079392 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gb54d" Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.161732 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh"] Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.185116 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk"] Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.223115 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-74624" event={"ID":"17fe7557-8cf3-4f24-86a3-993037455f15","Type":"ContainerStarted","Data":"80d45a1729d5c7fae8e68b7835e5f8fed296553da84bc3e67c1aca9f5d4ced08"} Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.224942 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z" event={"ID":"c4a6e8c2-5708-45ec-8cd7-08d552abbe53","Type":"ContainerStarted","Data":"e445cf14c23b3958cf9354a7529f70e0091dd30e308fe9b2da0648a5b6c278d9"} Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.229188 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk" event={"ID":"3e3f67ff-5285-401a-a19e-2476a8334248","Type":"ContainerStarted","Data":"179e07e06100c74708b0781ac6f9069d3c6cd58ad9db709ddebf2a21cf0f7b09"} Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.351708 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5"] Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.356567 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs"] Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 
08:58:55.358836 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.358965 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.359172 4788 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.359225 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs podName:d337efba-1c27-47ac-bdd7-17c3848678cb nodeName:}" failed. No retries permitted until 2026-02-19 08:58:56.359211188 +0000 UTC m=+838.347222660 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs") pod "openstack-operator-controller-manager-6774fbc4bc-5gddb" (UID: "d337efba-1c27-47ac-bdd7-17c3848678cb") : secret "webhook-server-cert" not found Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.359598 4788 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.359628 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs podName:d337efba-1c27-47ac-bdd7-17c3848678cb nodeName:}" failed. No retries permitted until 2026-02-19 08:58:56.359618727 +0000 UTC m=+838.347630199 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs") pod "openstack-operator-controller-manager-6774fbc4bc-5gddb" (UID: "d337efba-1c27-47ac-bdd7-17c3848678cb") : secret "metrics-server-cert" not found Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.382652 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode87d14f5_ac68_489e_a79c_d9962b5786e9.slice/crio-23498abdf594bb19b4e4ea9030ac59d2fb63af699bb06c79ceb0b859a64145f5 WatchSource:0}: Error finding container 23498abdf594bb19b4e4ea9030ac59d2fb63af699bb06c79ceb0b859a64145f5: Status 404 returned error can't find the container with id 23498abdf594bb19b4e4ea9030ac59d2fb63af699bb06c79ceb0b859a64145f5 Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.492710 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q"] Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.504817 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5"] Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.505562 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8156be62_dae2_4105_9c97_7cdd398e1eb4.slice/crio-3e557818ad1f38301e14acc2ed379c8c71ac3b686c16ec52d40dc26c8d04414e WatchSource:0}: Error finding container 3e557818ad1f38301e14acc2ed379c8c71ac3b686c16ec52d40dc26c8d04414e: Status 404 returned error can't find the container with id 3e557818ad1f38301e14acc2ed379c8c71ac3b686c16ec52d40dc26c8d04414e Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.506989 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a57c46b_96e1_4c3a_aede_8b9ced264828.slice/crio-2b22213860b812bd1425ab4fa39e743a2fb8d4360b6f98cc3e62ffd4ed8190e0 WatchSource:0}: Error finding container 2b22213860b812bd1425ab4fa39e743a2fb8d4360b6f98cc3e62ffd4ed8190e0: Status 404 returned error can't find the container with id 2b22213860b812bd1425ab4fa39e743a2fb8d4360b6f98cc3e62ffd4ed8190e0 Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.515086 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69"] Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.515358 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod226ea1a2_f858_46df_8a62_12f5c41da0c5.slice/crio-eb1040d67847fc3145bdec42f46dcd2c2e6cf1a1f91ac2d097322db0499b7625 WatchSource:0}: Error finding container eb1040d67847fc3145bdec42f46dcd2c2e6cf1a1f91ac2d097322db0499b7625: Status 404 returned error can't find the container with id eb1040d67847fc3145bdec42f46dcd2c2e6cf1a1f91ac2d097322db0499b7625 Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.520062 4788 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn"] Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.654690 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll"] Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.659415 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9"] Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.667502 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs"] Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.672309 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91c888b5_7f35_4049_830e_855914654f90.slice/crio-5e238fceab1e54a63f6e3d02e4de2ffb91711084d0fbe83dae892b289451f8d6 WatchSource:0}: Error finding container 5e238fceab1e54a63f6e3d02e4de2ffb91711084d0fbe83dae892b289451f8d6: Status 404 returned error can't find the container with id 5e238fceab1e54a63f6e3d02e4de2ffb91711084d0fbe83dae892b289451f8d6 Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.672938 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r"] Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.676862 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75"] Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.678135 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b85fb1_54ad_440a_9c9d_d9969f34f1c7.slice/crio-e9eda45c6bc0f99628f9addc35ae79b63673ed2e19e2f7078373b8aee87099c7 WatchSource:0}: Error finding container 
e9eda45c6bc0f99628f9addc35ae79b63673ed2e19e2f7078373b8aee87099c7: Status 404 returned error can't find the container with id e9eda45c6bc0f99628f9addc35ae79b63673ed2e19e2f7078373b8aee87099c7 Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.683369 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp"] Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.695877 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04be53c8_52b4_43fd_9cab_f1484fd17140.slice/crio-d4ee9a84666153e97ac0bda4d8823deace092a33f0de0f4767648aedcd7e2502 WatchSource:0}: Error finding container d4ee9a84666153e97ac0bda4d8823deace092a33f0de0f4767648aedcd7e2502: Status 404 returned error can't find the container with id d4ee9a84666153e97ac0bda4d8823deace092a33f0de0f4767648aedcd7e2502 Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.702432 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87f1ff04_454b_4c9d_82e6_5e7239c63978.slice/crio-d39364a9a956b540f5f44350451fffd984b8639b286d4c2165f5573c1f8b9737 WatchSource:0}: Error finding container d39364a9a956b540f5f44350451fffd984b8639b286d4c2165f5573c1f8b9737: Status 404 returned error can't find the container with id d39364a9a956b540f5f44350451fffd984b8639b286d4c2165f5573c1f8b9737 Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.704155 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j59hv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-fjt75_openstack-operators(87f1ff04-454b-4c9d-82e6-5e7239c63978): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.705099 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53d52f5e_a729_4d03_b949_7ccb6719754c.slice/crio-066be8097acb805705706458eda48b1446f467749443c52d9d28112247cc8880 WatchSource:0}: Error finding container 066be8097acb805705706458eda48b1446f467749443c52d9d28112247cc8880: Status 404 returned error can't find the container with id 066be8097acb805705706458eda48b1446f467749443c52d9d28112247cc8880 Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.705813 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75" podUID="87f1ff04-454b-4c9d-82e6-5e7239c63978" Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.709306 4788 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod777de642_ca99_4f43_b282_5c9703e97dfe.slice/crio-2df960bb113cd5ffcd20c1fd77fa52431b0d14d2e60dc3ba2b14539aa6373163 WatchSource:0}: Error finding container 2df960bb113cd5ffcd20c1fd77fa52431b0d14d2e60dc3ba2b14539aa6373163: Status 404 returned error can't find the container with id 2df960bb113cd5ffcd20c1fd77fa52431b0d14d2e60dc3ba2b14539aa6373163 Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.710489 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z7gk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-q6x4r_openstack-operators(53d52f5e-a729-4d03-b949-7ccb6719754c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.711576 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r" podUID="53d52f5e-a729-4d03-b949-7ccb6719754c" Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.711968 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ffn4p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-ncsrp_openstack-operators(777de642-ca99-4f43-b282-5c9703e97dfe): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.713593 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp" podUID="777de642-ca99-4f43-b282-5c9703e97dfe" Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.764562 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert\") pod \"infra-operator-controller-manager-79d975b745-ccpr8\" (UID: \"bc6d475e-ace7-47ba-a9ba-cb493c7225c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8" Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.764724 4788 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 
08:58:55.764802 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert podName:bc6d475e-ace7-47ba-a9ba-cb493c7225c9 nodeName:}" failed. No retries permitted until 2026-02-19 08:58:57.764787094 +0000 UTC m=+839.752798566 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert") pod "infra-operator-controller-manager-79d975b745-ccpr8" (UID: "bc6d475e-ace7-47ba-a9ba-cb493c7225c9") : secret "infra-operator-webhook-server-cert" not found Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.819386 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gb54d"] Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.821307 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfd2e1e0_4b24_4e57_8a3d_779a03729f0e.slice/crio-958afc91f7e1577e46110d64c7de9daa9b1970aba2a1fd48ad9bc2d16dd1c98c WatchSource:0}: Error finding container 958afc91f7e1577e46110d64c7de9daa9b1970aba2a1fd48ad9bc2d16dd1c98c: Status 404 returned error can't find the container with id 958afc91f7e1577e46110d64c7de9daa9b1970aba2a1fd48ad9bc2d16dd1c98c Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.823406 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d01364a_9507_4d68_bb0b_efbb67fe2e48.slice/crio-fd6f148751d8615f76a5d393475dee892353ebf9d9100430184d3893496586ee WatchSource:0}: Error finding container fd6f148751d8615f76a5d393475dee892353ebf9d9100430184d3893496586ee: Status 404 returned error can't find the container with id fd6f148751d8615f76a5d393475dee892353ebf9d9100430184d3893496586ee Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.824630 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl"] Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.825877 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-774d4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-6dvcl_openstack-operators(7d01364a-9507-4d68-bb0b-efbb67fe2e48): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.827013 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl" podUID="7d01364a-9507-4d68-bb0b-efbb67fe2e48" Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.828715 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-g7mw7"] Feb 19 08:58:55 crc kubenswrapper[4788]: W0219 08:58:55.830648 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa0d4ba2_0512_42e8_8c66_137bf969f706.slice/crio-e26624c2f69226f642272412a7690a151dcd36fcd75a13f4e8254275fb0b0ebf WatchSource:0}: Error finding container e26624c2f69226f642272412a7690a151dcd36fcd75a13f4e8254275fb0b0ebf: Status 404 returned error can't find the container with id 
e26624c2f69226f642272412a7690a151dcd36fcd75a13f4e8254275fb0b0ebf Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.833963 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ss65h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-g7mw7_openstack-operators(aa0d4ba2-0512-42e8-8c66-137bf969f706): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.835114 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7" podUID="aa0d4ba2-0512-42e8-8c66-137bf969f706" Feb 19 08:58:55 crc kubenswrapper[4788]: I0219 08:58:55.967136 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg\" (UID: \"cad2b3a6-6577-4136-bf9e-213884d94b31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.967391 4788 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 
08:58:55 crc kubenswrapper[4788]: E0219 08:58:55.967536 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert podName:cad2b3a6-6577-4136-bf9e-213884d94b31 nodeName:}" failed. No retries permitted until 2026-02-19 08:58:57.967514351 +0000 UTC m=+839.955525823 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" (UID: "cad2b3a6-6577-4136-bf9e-213884d94b31") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.235826 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9" event={"ID":"04be53c8-52b4-43fd-9cab-f1484fd17140","Type":"ContainerStarted","Data":"d4ee9a84666153e97ac0bda4d8823deace092a33f0de0f4767648aedcd7e2502"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.237527 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5" event={"ID":"6a57c46b-96e1-4c3a-aede-8b9ced264828","Type":"ContainerStarted","Data":"2b22213860b812bd1425ab4fa39e743a2fb8d4360b6f98cc3e62ffd4ed8190e0"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.239321 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs" event={"ID":"cb9194bc-2a08-4c61-9302-14d3ab1b731a","Type":"ContainerStarted","Data":"3f4295dc7288aedcc5d1bc0b88fa4c563fd107a395d2fd3a4b10df5381b7fa18"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.241690 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q" 
event={"ID":"8156be62-dae2-4105-9c97-7cdd398e1eb4","Type":"ContainerStarted","Data":"3e557818ad1f38301e14acc2ed379c8c71ac3b686c16ec52d40dc26c8d04414e"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.244360 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs" event={"ID":"91c888b5-7f35-4049-830e-855914654f90","Type":"ContainerStarted","Data":"5e238fceab1e54a63f6e3d02e4de2ffb91711084d0fbe83dae892b289451f8d6"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.245362 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5" event={"ID":"e87d14f5-ac68-489e-a79c-d9962b5786e9","Type":"ContainerStarted","Data":"23498abdf594bb19b4e4ea9030ac59d2fb63af699bb06c79ceb0b859a64145f5"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.246446 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75" event={"ID":"87f1ff04-454b-4c9d-82e6-5e7239c63978","Type":"ContainerStarted","Data":"d39364a9a956b540f5f44350451fffd984b8639b286d4c2165f5573c1f8b9737"} Feb 19 08:58:56 crc kubenswrapper[4788]: E0219 08:58:56.247889 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75" podUID="87f1ff04-454b-4c9d-82e6-5e7239c63978" Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.248655 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh" 
event={"ID":"0da97318-27f4-465d-91c6-c44004a9e291","Type":"ContainerStarted","Data":"7fa9f5cc4301558236b293e12fe962fe916ecac94fd1d6fe75d158d0b0806191"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.286696 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69" event={"ID":"226ea1a2-f858-46df-8a62-12f5c41da0c5","Type":"ContainerStarted","Data":"eb1040d67847fc3145bdec42f46dcd2c2e6cf1a1f91ac2d097322db0499b7625"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.342539 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk" event={"ID":"acc6ac6b-da33-4eb0-a2b9-33b6e45a118e","Type":"ContainerStarted","Data":"848f9130f7e3752d1c841b77b1d742174ea1193b53ab900845c7a403c4e84f04"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.374327 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.374428 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:56 crc kubenswrapper[4788]: E0219 08:58:56.374592 4788 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 08:58:56 crc kubenswrapper[4788]: E0219 08:58:56.374656 4788 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs podName:d337efba-1c27-47ac-bdd7-17c3848678cb nodeName:}" failed. No retries permitted until 2026-02-19 08:58:58.374642864 +0000 UTC m=+840.362654336 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs") pod "openstack-operator-controller-manager-6774fbc4bc-5gddb" (UID: "d337efba-1c27-47ac-bdd7-17c3848678cb") : secret "metrics-server-cert" not found Feb 19 08:58:56 crc kubenswrapper[4788]: E0219 08:58:56.374805 4788 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 08:58:56 crc kubenswrapper[4788]: E0219 08:58:56.374907 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs podName:d337efba-1c27-47ac-bdd7-17c3848678cb nodeName:}" failed. No retries permitted until 2026-02-19 08:58:58.37488418 +0000 UTC m=+840.362895712 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs") pod "openstack-operator-controller-manager-6774fbc4bc-5gddb" (UID: "d337efba-1c27-47ac-bdd7-17c3848678cb") : secret "webhook-server-cert" not found Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.380864 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7" event={"ID":"aa0d4ba2-0512-42e8-8c66-137bf969f706","Type":"ContainerStarted","Data":"e26624c2f69226f642272412a7690a151dcd36fcd75a13f4e8254275fb0b0ebf"} Feb 19 08:58:56 crc kubenswrapper[4788]: E0219 08:58:56.394509 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7" podUID="aa0d4ba2-0512-42e8-8c66-137bf969f706" Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.396969 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gb54d" event={"ID":"cfd2e1e0-4b24-4e57-8a3d-779a03729f0e","Type":"ContainerStarted","Data":"958afc91f7e1577e46110d64c7de9daa9b1970aba2a1fd48ad9bc2d16dd1c98c"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.401083 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl" event={"ID":"7d01364a-9507-4d68-bb0b-efbb67fe2e48","Type":"ContainerStarted","Data":"fd6f148751d8615f76a5d393475dee892353ebf9d9100430184d3893496586ee"} Feb 19 08:58:56 crc kubenswrapper[4788]: E0219 08:58:56.408573 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl" podUID="7d01364a-9507-4d68-bb0b-efbb67fe2e48" Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.409281 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll" event={"ID":"52b85fb1-54ad-440a-9c9d-d9969f34f1c7","Type":"ContainerStarted","Data":"e9eda45c6bc0f99628f9addc35ae79b63673ed2e19e2f7078373b8aee87099c7"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.410160 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn" event={"ID":"c01b7431-8487-45ac-9b7b-c1ec5dc115f0","Type":"ContainerStarted","Data":"2ddd7d0807c29d6db4923d7505997b8a300a82dff34032320ba7157f28265577"} Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.418318 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp" event={"ID":"777de642-ca99-4f43-b282-5c9703e97dfe","Type":"ContainerStarted","Data":"2df960bb113cd5ffcd20c1fd77fa52431b0d14d2e60dc3ba2b14539aa6373163"} Feb 19 08:58:56 crc kubenswrapper[4788]: E0219 08:58:56.423085 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp" podUID="777de642-ca99-4f43-b282-5c9703e97dfe" Feb 19 08:58:56 crc kubenswrapper[4788]: I0219 08:58:56.423220 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r" 
event={"ID":"53d52f5e-a729-4d03-b949-7ccb6719754c","Type":"ContainerStarted","Data":"066be8097acb805705706458eda48b1446f467749443c52d9d28112247cc8880"} Feb 19 08:58:56 crc kubenswrapper[4788]: E0219 08:58:56.424260 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r" podUID="53d52f5e-a729-4d03-b949-7ccb6719754c" Feb 19 08:58:57 crc kubenswrapper[4788]: E0219 08:58:57.436105 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75" podUID="87f1ff04-454b-4c9d-82e6-5e7239c63978" Feb 19 08:58:57 crc kubenswrapper[4788]: E0219 08:58:57.436686 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp" podUID="777de642-ca99-4f43-b282-5c9703e97dfe" Feb 19 08:58:57 crc kubenswrapper[4788]: E0219 08:58:57.436730 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r" 
podUID="53d52f5e-a729-4d03-b949-7ccb6719754c" Feb 19 08:58:57 crc kubenswrapper[4788]: E0219 08:58:57.436733 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7" podUID="aa0d4ba2-0512-42e8-8c66-137bf969f706" Feb 19 08:58:57 crc kubenswrapper[4788]: E0219 08:58:57.436789 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl" podUID="7d01364a-9507-4d68-bb0b-efbb67fe2e48" Feb 19 08:58:57 crc kubenswrapper[4788]: I0219 08:58:57.849682 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert\") pod \"infra-operator-controller-manager-79d975b745-ccpr8\" (UID: \"bc6d475e-ace7-47ba-a9ba-cb493c7225c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8" Feb 19 08:58:57 crc kubenswrapper[4788]: E0219 08:58:57.849919 4788 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 08:58:57 crc kubenswrapper[4788]: E0219 08:58:57.850212 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert podName:bc6d475e-ace7-47ba-a9ba-cb493c7225c9 nodeName:}" failed. No retries permitted until 2026-02-19 08:59:01.850193494 +0000 UTC m=+843.838204976 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert") pod "infra-operator-controller-manager-79d975b745-ccpr8" (UID: "bc6d475e-ace7-47ba-a9ba-cb493c7225c9") : secret "infra-operator-webhook-server-cert" not found Feb 19 08:58:58 crc kubenswrapper[4788]: I0219 08:58:58.052843 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg\" (UID: \"cad2b3a6-6577-4136-bf9e-213884d94b31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:58:58 crc kubenswrapper[4788]: E0219 08:58:58.053008 4788 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 08:58:58 crc kubenswrapper[4788]: E0219 08:58:58.053084 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert podName:cad2b3a6-6577-4136-bf9e-213884d94b31 nodeName:}" failed. No retries permitted until 2026-02-19 08:59:02.053066724 +0000 UTC m=+844.041078196 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" (UID: "cad2b3a6-6577-4136-bf9e-213884d94b31") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 08:58:58 crc kubenswrapper[4788]: I0219 08:58:58.458747 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:58 crc kubenswrapper[4788]: I0219 08:58:58.458863 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:58:58 crc kubenswrapper[4788]: E0219 08:58:58.459016 4788 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 08:58:58 crc kubenswrapper[4788]: E0219 08:58:58.459077 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs podName:d337efba-1c27-47ac-bdd7-17c3848678cb nodeName:}" failed. No retries permitted until 2026-02-19 08:59:02.4590582 +0000 UTC m=+844.447069682 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs") pod "openstack-operator-controller-manager-6774fbc4bc-5gddb" (UID: "d337efba-1c27-47ac-bdd7-17c3848678cb") : secret "metrics-server-cert" not found Feb 19 08:58:58 crc kubenswrapper[4788]: E0219 08:58:58.459192 4788 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 08:58:58 crc kubenswrapper[4788]: E0219 08:58:58.459372 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs podName:d337efba-1c27-47ac-bdd7-17c3848678cb nodeName:}" failed. No retries permitted until 2026-02-19 08:59:02.459333357 +0000 UTC m=+844.447344859 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs") pod "openstack-operator-controller-manager-6774fbc4bc-5gddb" (UID: "d337efba-1c27-47ac-bdd7-17c3848678cb") : secret "webhook-server-cert" not found Feb 19 08:59:01 crc kubenswrapper[4788]: I0219 08:59:01.912081 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert\") pod \"infra-operator-controller-manager-79d975b745-ccpr8\" (UID: \"bc6d475e-ace7-47ba-a9ba-cb493c7225c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8" Feb 19 08:59:01 crc kubenswrapper[4788]: E0219 08:59:01.912533 4788 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 19 08:59:01 crc kubenswrapper[4788]: E0219 08:59:01.912638 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert 
podName:bc6d475e-ace7-47ba-a9ba-cb493c7225c9 nodeName:}" failed. No retries permitted until 2026-02-19 08:59:09.912616885 +0000 UTC m=+851.900628367 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert") pod "infra-operator-controller-manager-79d975b745-ccpr8" (UID: "bc6d475e-ace7-47ba-a9ba-cb493c7225c9") : secret "infra-operator-webhook-server-cert" not found Feb 19 08:59:02 crc kubenswrapper[4788]: I0219 08:59:02.115322 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg\" (UID: \"cad2b3a6-6577-4136-bf9e-213884d94b31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:59:02 crc kubenswrapper[4788]: E0219 08:59:02.115551 4788 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 08:59:02 crc kubenswrapper[4788]: E0219 08:59:02.115700 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert podName:cad2b3a6-6577-4136-bf9e-213884d94b31 nodeName:}" failed. No retries permitted until 2026-02-19 08:59:10.115684609 +0000 UTC m=+852.103696081 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" (UID: "cad2b3a6-6577-4136-bf9e-213884d94b31") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 19 08:59:02 crc kubenswrapper[4788]: I0219 08:59:02.522375 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:59:02 crc kubenswrapper[4788]: I0219 08:59:02.522525 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:59:02 crc kubenswrapper[4788]: E0219 08:59:02.522635 4788 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 08:59:02 crc kubenswrapper[4788]: E0219 08:59:02.522650 4788 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 19 08:59:02 crc kubenswrapper[4788]: E0219 08:59:02.522795 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs podName:d337efba-1c27-47ac-bdd7-17c3848678cb nodeName:}" failed. No retries permitted until 2026-02-19 08:59:10.52272295 +0000 UTC m=+852.510734442 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs") pod "openstack-operator-controller-manager-6774fbc4bc-5gddb" (UID: "d337efba-1c27-47ac-bdd7-17c3848678cb") : secret "webhook-server-cert" not found Feb 19 08:59:02 crc kubenswrapper[4788]: E0219 08:59:02.522824 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs podName:d337efba-1c27-47ac-bdd7-17c3848678cb nodeName:}" failed. No retries permitted until 2026-02-19 08:59:10.522811223 +0000 UTC m=+852.510822695 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs") pod "openstack-operator-controller-manager-6774fbc4bc-5gddb" (UID: "d337efba-1c27-47ac-bdd7-17c3848678cb") : secret "metrics-server-cert" not found Feb 19 08:59:07 crc kubenswrapper[4788]: E0219 08:59:07.976229 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc" Feb 19 08:59:07 crc kubenswrapper[4788]: E0219 08:59:07.976715 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sthq4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-868647ff47-8jh9z_openstack-operators(c4a6e8c2-5708-45ec-8cd7-08d552abbe53): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 08:59:07 crc kubenswrapper[4788]: E0219 08:59:07.977950 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z" podUID="c4a6e8c2-5708-45ec-8cd7-08d552abbe53" Feb 19 08:59:08 crc kubenswrapper[4788]: E0219 08:59:08.457927 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 19 08:59:08 crc kubenswrapper[4788]: E0219 08:59:08.458116 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mnl94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-6zwrs_openstack-operators(cb9194bc-2a08-4c61-9302-14d3ab1b731a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 08:59:08 crc kubenswrapper[4788]: E0219 08:59:08.459294 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs" podUID="cb9194bc-2a08-4c61-9302-14d3ab1b731a" Feb 19 08:59:08 crc kubenswrapper[4788]: E0219 08:59:08.506176 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:90ad8fd8c1889b6be77925016532218eb6149d2c1c8535a5f9f1775c776fa6cc\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z" podUID="c4a6e8c2-5708-45ec-8cd7-08d552abbe53" Feb 19 08:59:08 crc kubenswrapper[4788]: E0219 08:59:08.507120 4788 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs" podUID="cb9194bc-2a08-4c61-9302-14d3ab1b731a" Feb 19 08:59:09 crc kubenswrapper[4788]: E0219 08:59:09.011890 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 19 08:59:09 crc kubenswrapper[4788]: E0219 08:59:09.012323 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cw9x4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-9sr69_openstack-operators(226ea1a2-f858-46df-8a62-12f5c41da0c5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 08:59:09 crc kubenswrapper[4788]: E0219 08:59:09.014064 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69" podUID="226ea1a2-f858-46df-8a62-12f5c41da0c5" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.521235 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk" 
event={"ID":"acc6ac6b-da33-4eb0-a2b9-33b6e45a118e","Type":"ContainerStarted","Data":"d1f519c68e87c60761e6910145746bdd4c3b3432cc06e7ead4ac8dfec4920756"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.522172 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.529045 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q" event={"ID":"8156be62-dae2-4105-9c97-7cdd398e1eb4","Type":"ContainerStarted","Data":"82d4bed39fbeef30796aa6ee18859003c007513addb2bfcfa55b2b1814afdfc9"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.529695 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.535886 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gb54d" event={"ID":"cfd2e1e0-4b24-4e57-8a3d-779a03729f0e","Type":"ContainerStarted","Data":"98a744cbf67fe8b3d4095827ffa76c3f2aba844342fec43da3d8b0c8d697943a"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.537681 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5" event={"ID":"e87d14f5-ac68-489e-a79c-d9962b5786e9","Type":"ContainerStarted","Data":"155ec1077cfa3d59f2eca4c048a90af9129b6bd6d6323fe7c1b6453cad82cbba"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.537854 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.540896 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk" podStartSLOduration=2.755458964 podStartE2EDuration="16.540882151s" podCreationTimestamp="2026-02-19 08:58:53 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.202896377 +0000 UTC m=+837.190907849" lastFinishedPulling="2026-02-19 08:59:08.988319564 +0000 UTC m=+850.976331036" observedRunningTime="2026-02-19 08:59:09.538485265 +0000 UTC m=+851.526496737" watchObservedRunningTime="2026-02-19 08:59:09.540882151 +0000 UTC m=+851.528893623" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.542079 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-74624" event={"ID":"17fe7557-8cf3-4f24-86a3-993037455f15","Type":"ContainerStarted","Data":"915fe37f3f640e920d256d6d86e94927c4f12d578fa6c4c74b47d03ced58e953"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.542775 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-74624" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.547869 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9" event={"ID":"04be53c8-52b4-43fd-9cab-f1484fd17140","Type":"ContainerStarted","Data":"0632dfda0f6b65e9c750cc5341326631b25abbf478a74cde0b8ab8840b1cfefd"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.548237 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.557258 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q" podStartSLOduration=3.077473497 podStartE2EDuration="16.557232687s" podCreationTimestamp="2026-02-19 08:58:53 +0000 UTC" 
firstStartedPulling="2026-02-19 08:58:55.508101513 +0000 UTC m=+837.496112985" lastFinishedPulling="2026-02-19 08:59:08.987860703 +0000 UTC m=+850.975872175" observedRunningTime="2026-02-19 08:59:09.554080893 +0000 UTC m=+851.542092365" watchObservedRunningTime="2026-02-19 08:59:09.557232687 +0000 UTC m=+851.545244149" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.559705 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5" event={"ID":"6a57c46b-96e1-4c3a-aede-8b9ced264828","Type":"ContainerStarted","Data":"e988eab8f228fef05719fbb1355353f3238fa524d906899fa1e60e8d0c7f572b"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.559850 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.565733 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh" event={"ID":"0da97318-27f4-465d-91c6-c44004a9e291","Type":"ContainerStarted","Data":"764c99bd90ba5beaa44a91e8b5a0a8af5de6093e92f947c0f6d8067b8a75ee98"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.565985 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.574873 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs" event={"ID":"91c888b5-7f35-4049-830e-855914654f90","Type":"ContainerStarted","Data":"4ea2eaaefb62f802527d563518fe9895a95608727649bd6ba4ae39b8d0c8340f"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.575004 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs" Feb 
19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.577696 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll" event={"ID":"52b85fb1-54ad-440a-9c9d-d9969f34f1c7","Type":"ContainerStarted","Data":"ce46e7cfa7a98f2b3b0ac9c0507fac4e6a230332f18267a68b9aef5d2506a659"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.578083 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.581760 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-gb54d" podStartSLOduration=2.418273814 podStartE2EDuration="15.581747216s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.824386301 +0000 UTC m=+837.812397773" lastFinishedPulling="2026-02-19 08:59:08.987859703 +0000 UTC m=+850.975871175" observedRunningTime="2026-02-19 08:59:09.576668986 +0000 UTC m=+851.564680458" watchObservedRunningTime="2026-02-19 08:59:09.581747216 +0000 UTC m=+851.569758688" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.585938 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn" event={"ID":"c01b7431-8487-45ac-9b7b-c1ec5dc115f0","Type":"ContainerStarted","Data":"5e690bf28e1241555e5d18994f555da62a437997e4b2a163f0003b14342bef60"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.586597 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.593420 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk" 
event={"ID":"3e3f67ff-5285-401a-a19e-2476a8334248","Type":"ContainerStarted","Data":"c9698d3866367099aa470b391dc73a778e60314087339b46e98d87c4456165ce"} Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.593517 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk" Feb 19 08:59:09 crc kubenswrapper[4788]: E0219 08:59:09.594777 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69" podUID="226ea1a2-f858-46df-8a62-12f5c41da0c5" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.597841 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5" podStartSLOduration=2.998472272 podStartE2EDuration="16.597827166s" podCreationTimestamp="2026-02-19 08:58:53 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.388406107 +0000 UTC m=+837.376417579" lastFinishedPulling="2026-02-19 08:59:08.987761011 +0000 UTC m=+850.975772473" observedRunningTime="2026-02-19 08:59:09.596000483 +0000 UTC m=+851.584011955" watchObservedRunningTime="2026-02-19 08:59:09.597827166 +0000 UTC m=+851.585838638" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.615582 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs" podStartSLOduration=2.305745227 podStartE2EDuration="15.615565584s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.676905209 +0000 UTC m=+837.664916681" lastFinishedPulling="2026-02-19 08:59:08.986725566 +0000 UTC m=+850.974737038" 
observedRunningTime="2026-02-19 08:59:09.61451722 +0000 UTC m=+851.602528692" watchObservedRunningTime="2026-02-19 08:59:09.615565584 +0000 UTC m=+851.603577056" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.641895 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll" podStartSLOduration=2.347665047 podStartE2EDuration="15.641879266s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.694280719 +0000 UTC m=+837.682292191" lastFinishedPulling="2026-02-19 08:59:08.988494938 +0000 UTC m=+850.976506410" observedRunningTime="2026-02-19 08:59:09.638162168 +0000 UTC m=+851.626173640" watchObservedRunningTime="2026-02-19 08:59:09.641879266 +0000 UTC m=+851.629890738" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.673818 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh" podStartSLOduration=2.8907190270000003 podStartE2EDuration="16.673796939s" podCreationTimestamp="2026-02-19 08:58:53 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.2051566 +0000 UTC m=+837.193168072" lastFinishedPulling="2026-02-19 08:59:08.988234512 +0000 UTC m=+850.976245984" observedRunningTime="2026-02-19 08:59:09.662976894 +0000 UTC m=+851.650988366" watchObservedRunningTime="2026-02-19 08:59:09.673796939 +0000 UTC m=+851.661808411" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.700996 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9" podStartSLOduration=2.412771644 podStartE2EDuration="15.700975251s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.699147684 +0000 UTC m=+837.687159156" lastFinishedPulling="2026-02-19 08:59:08.987351291 +0000 UTC m=+850.975362763" 
observedRunningTime="2026-02-19 08:59:09.697685904 +0000 UTC m=+851.685697386" watchObservedRunningTime="2026-02-19 08:59:09.700975251 +0000 UTC m=+851.688986723" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.732989 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn" podStartSLOduration=2.271044029 podStartE2EDuration="15.732968527s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.524799918 +0000 UTC m=+837.512811390" lastFinishedPulling="2026-02-19 08:59:08.986724416 +0000 UTC m=+850.974735888" observedRunningTime="2026-02-19 08:59:09.721830894 +0000 UTC m=+851.709842376" watchObservedRunningTime="2026-02-19 08:59:09.732968527 +0000 UTC m=+851.720979999" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.952597 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert\") pod \"infra-operator-controller-manager-79d975b745-ccpr8\" (UID: \"bc6d475e-ace7-47ba-a9ba-cb493c7225c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.958801 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc6d475e-ace7-47ba-a9ba-cb493c7225c9-cert\") pod \"infra-operator-controller-manager-79d975b745-ccpr8\" (UID: \"bc6d475e-ace7-47ba-a9ba-cb493c7225c9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.983495 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5" podStartSLOduration=2.505548056 podStartE2EDuration="15.983478202s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" 
firstStartedPulling="2026-02-19 08:58:55.509506187 +0000 UTC m=+837.497517659" lastFinishedPulling="2026-02-19 08:59:08.987436333 +0000 UTC m=+850.975447805" observedRunningTime="2026-02-19 08:59:09.98297497 +0000 UTC m=+851.970986442" watchObservedRunningTime="2026-02-19 08:59:09.983478202 +0000 UTC m=+851.971489674" Feb 19 08:59:09 crc kubenswrapper[4788]: I0219 08:59:09.984589 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk" podStartSLOduration=3.348952498 podStartE2EDuration="16.984583338s" podCreationTimestamp="2026-02-19 08:58:53 +0000 UTC" firstStartedPulling="2026-02-19 08:58:54.795035167 +0000 UTC m=+836.783046639" lastFinishedPulling="2026-02-19 08:59:08.430666007 +0000 UTC m=+850.418677479" observedRunningTime="2026-02-19 08:59:09.839559073 +0000 UTC m=+851.827570545" watchObservedRunningTime="2026-02-19 08:59:09.984583338 +0000 UTC m=+851.972594810" Feb 19 08:59:10 crc kubenswrapper[4788]: I0219 08:59:10.050319 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-74624" podStartSLOduration=3.103356979 podStartE2EDuration="17.05030508s" podCreationTimestamp="2026-02-19 08:58:53 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.043728029 +0000 UTC m=+837.031739501" lastFinishedPulling="2026-02-19 08:59:08.99067613 +0000 UTC m=+850.978687602" observedRunningTime="2026-02-19 08:59:10.044554514 +0000 UTC m=+852.032565996" watchObservedRunningTime="2026-02-19 08:59:10.05030508 +0000 UTC m=+852.038316552" Feb 19 08:59:10 crc kubenswrapper[4788]: I0219 08:59:10.057976 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-q2jtx" Feb 19 08:59:10 crc kubenswrapper[4788]: I0219 08:59:10.060057 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8" Feb 19 08:59:10 crc kubenswrapper[4788]: I0219 08:59:10.155905 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg\" (UID: \"cad2b3a6-6577-4136-bf9e-213884d94b31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:59:10 crc kubenswrapper[4788]: I0219 08:59:10.170913 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cad2b3a6-6577-4136-bf9e-213884d94b31-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg\" (UID: \"cad2b3a6-6577-4136-bf9e-213884d94b31\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:59:10 crc kubenswrapper[4788]: I0219 08:59:10.224923 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2xldr" Feb 19 08:59:10 crc kubenswrapper[4788]: I0219 08:59:10.232336 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" Feb 19 08:59:10 crc kubenswrapper[4788]: I0219 08:59:10.573971 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:59:10 crc kubenswrapper[4788]: I0219 08:59:10.574037 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:59:10 crc kubenswrapper[4788]: E0219 08:59:10.574391 4788 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 19 08:59:10 crc kubenswrapper[4788]: E0219 08:59:10.574463 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs podName:d337efba-1c27-47ac-bdd7-17c3848678cb nodeName:}" failed. No retries permitted until 2026-02-19 08:59:26.574445726 +0000 UTC m=+868.562457188 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs") pod "openstack-operator-controller-manager-6774fbc4bc-5gddb" (UID: "d337efba-1c27-47ac-bdd7-17c3848678cb") : secret "webhook-server-cert" not found Feb 19 08:59:10 crc kubenswrapper[4788]: I0219 08:59:10.579845 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-metrics-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" Feb 19 08:59:10 crc kubenswrapper[4788]: I0219 08:59:10.906696 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg"] Feb 19 08:59:10 crc kubenswrapper[4788]: W0219 08:59:10.913084 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad2b3a6_6577_4136_bf9e_213884d94b31.slice/crio-e11c16354903253e7ef484541b97f79f75e35b47914ec7f81d209fcf44dc6950 WatchSource:0}: Error finding container e11c16354903253e7ef484541b97f79f75e35b47914ec7f81d209fcf44dc6950: Status 404 returned error can't find the container with id e11c16354903253e7ef484541b97f79f75e35b47914ec7f81d209fcf44dc6950 Feb 19 08:59:11 crc kubenswrapper[4788]: I0219 08:59:11.002448 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8"] Feb 19 08:59:11 crc kubenswrapper[4788]: W0219 08:59:11.020567 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc6d475e_ace7_47ba_a9ba_cb493c7225c9.slice/crio-d8db0c4eb228a1b7787a547c6ea17cd3ce805dd1c1c673e430f330aa5b62955e 
WatchSource:0}: Error finding container d8db0c4eb228a1b7787a547c6ea17cd3ce805dd1c1c673e430f330aa5b62955e: Status 404 returned error can't find the container with id d8db0c4eb228a1b7787a547c6ea17cd3ce805dd1c1c673e430f330aa5b62955e Feb 19 08:59:11 crc kubenswrapper[4788]: I0219 08:59:11.640406 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8" event={"ID":"bc6d475e-ace7-47ba-a9ba-cb493c7225c9","Type":"ContainerStarted","Data":"d8db0c4eb228a1b7787a547c6ea17cd3ce805dd1c1c673e430f330aa5b62955e"} Feb 19 08:59:11 crc kubenswrapper[4788]: I0219 08:59:11.642771 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" event={"ID":"cad2b3a6-6577-4136-bf9e-213884d94b31","Type":"ContainerStarted","Data":"e11c16354903253e7ef484541b97f79f75e35b47914ec7f81d209fcf44dc6950"} Feb 19 08:59:14 crc kubenswrapper[4788]: I0219 08:59:14.303221 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-tn2mk" Feb 19 08:59:14 crc kubenswrapper[4788]: I0219 08:59:14.316393 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-bp9gh" Feb 19 08:59:14 crc kubenswrapper[4788]: I0219 08:59:14.329488 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-74624" Feb 19 08:59:14 crc kubenswrapper[4788]: I0219 08:59:14.375471 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-57cc58f5d8-gxbgk" Feb 19 08:59:14 crc kubenswrapper[4788]: I0219 08:59:14.517141 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-754h5"
Feb 19 08:59:14 crc kubenswrapper[4788]: I0219 08:59:14.533906 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-rq69q"
Feb 19 08:59:14 crc kubenswrapper[4788]: I0219 08:59:14.542457 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-r2rc5"
Feb 19 08:59:14 crc kubenswrapper[4788]: I0219 08:59:14.562573 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-4j2v9"
Feb 19 08:59:14 crc kubenswrapper[4788]: I0219 08:59:14.609740 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ddzgn"
Feb 19 08:59:14 crc kubenswrapper[4788]: I0219 08:59:14.645596 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-dlfjs"
Feb 19 08:59:14 crc kubenswrapper[4788]: I0219 08:59:14.811792 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-pbtll"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.679770 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8" event={"ID":"bc6d475e-ace7-47ba-a9ba-cb493c7225c9","Type":"ContainerStarted","Data":"30eaa179d4deba19a31b1275695cb1debfb691cd42beb2c619d9c764d400d29e"}
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.681205 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.682735 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" event={"ID":"cad2b3a6-6577-4136-bf9e-213884d94b31","Type":"ContainerStarted","Data":"88f0a7aacd18834c1e67188cc69af0f93fd6e86b2151df04be08a160781d6ce3"}
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.683310 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.685017 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl" event={"ID":"7d01364a-9507-4d68-bb0b-efbb67fe2e48","Type":"ContainerStarted","Data":"c692f228a1ff9af5434e16a501fd901024acc2fe5407879c5a1d9e2c54ee9979"}
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.685267 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.686965 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7" event={"ID":"aa0d4ba2-0512-42e8-8c66-137bf969f706","Type":"ContainerStarted","Data":"d53015b784daf603277dfdc408ca9b3355b961be731e6826efbfe4c23db9c505"}
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.687105 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.688113 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp" event={"ID":"777de642-ca99-4f43-b282-5c9703e97dfe","Type":"ContainerStarted","Data":"12666c67b8268b1fc67cefa14e1b595a8fd95d4e6fa8c4690f7a04bf1fa41e52"}
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.688462 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.689918 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75" event={"ID":"87f1ff04-454b-4c9d-82e6-5e7239c63978","Type":"ContainerStarted","Data":"ac91f2edb5353f994f2116b1289a69b69d8471a3f944286e26ffdea4cd9855ad"}
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.690293 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.691606 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r" event={"ID":"53d52f5e-a729-4d03-b949-7ccb6719754c","Type":"ContainerStarted","Data":"e4b6887bab5cb562ed1ff406b5161304a19425c0d3527f327bd043b643d113c0"}
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.691945 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.705789 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8" podStartSLOduration=18.731129437 podStartE2EDuration="24.705774549s" podCreationTimestamp="2026-02-19 08:58:53 +0000 UTC" firstStartedPulling="2026-02-19 08:59:11.024028871 +0000 UTC m=+853.012040343" lastFinishedPulling="2026-02-19 08:59:16.998673983 +0000 UTC m=+858.986685455" observedRunningTime="2026-02-19 08:59:17.698778604 +0000 UTC m=+859.686790066" watchObservedRunningTime="2026-02-19 08:59:17.705774549 +0000 UTC m=+859.693786021"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.717169 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl" podStartSLOduration=2.562280405 podStartE2EDuration="23.717153658s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.825763044 +0000 UTC m=+837.813774516" lastFinishedPulling="2026-02-19 08:59:16.980636297 +0000 UTC m=+858.968647769" observedRunningTime="2026-02-19 08:59:17.715298614 +0000 UTC m=+859.703310086" watchObservedRunningTime="2026-02-19 08:59:17.717153658 +0000 UTC m=+859.705165130"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.735854 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75" podStartSLOduration=2.427194916 podStartE2EDuration="23.735835659s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.70402874 +0000 UTC m=+837.692040212" lastFinishedPulling="2026-02-19 08:59:17.012669473 +0000 UTC m=+859.000680955" observedRunningTime="2026-02-19 08:59:17.734431366 +0000 UTC m=+859.722442838" watchObservedRunningTime="2026-02-19 08:59:17.735835659 +0000 UTC m=+859.723847131"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.759550 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg" podStartSLOduration=17.695612559 podStartE2EDuration="23.759534828s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:59:10.916418041 +0000 UTC m=+852.904429513" lastFinishedPulling="2026-02-19 08:59:16.98034031 +0000 UTC m=+858.968351782" observedRunningTime="2026-02-19 08:59:17.754619412 +0000 UTC m=+859.742630894" watchObservedRunningTime="2026-02-19 08:59:17.759534828 +0000 UTC m=+859.747546300"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.774526 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7" podStartSLOduration=2.684114031 podStartE2EDuration="23.774512812s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.833813824 +0000 UTC m=+837.821825296" lastFinishedPulling="2026-02-19 08:59:16.924212605 +0000 UTC m=+858.912224077" observedRunningTime="2026-02-19 08:59:17.77232462 +0000 UTC m=+859.760336092" watchObservedRunningTime="2026-02-19 08:59:17.774512812 +0000 UTC m=+859.762524284"
Feb 19 08:59:17 crc kubenswrapper[4788]: I0219 08:59:17.791081 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r" podStartSLOduration=2.51505804 podStartE2EDuration="23.791065393s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.71039465 +0000 UTC m=+837.698406122" lastFinishedPulling="2026-02-19 08:59:16.986401963 +0000 UTC m=+858.974413475" observedRunningTime="2026-02-19 08:59:17.787920059 +0000 UTC m=+859.775931531" watchObservedRunningTime="2026-02-19 08:59:17.791065393 +0000 UTC m=+859.779076865"
Feb 19 08:59:22 crc kubenswrapper[4788]: I0219 08:59:22.779170 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp" podStartSLOduration=7.504220032 podStartE2EDuration="28.779145919s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.711822184 +0000 UTC m=+837.699833656" lastFinishedPulling="2026-02-19 08:59:16.986748071 +0000 UTC m=+858.974759543" observedRunningTime="2026-02-19 08:59:17.817671051 +0000 UTC m=+859.805682523" watchObservedRunningTime="2026-02-19 08:59:22.779145919 +0000 UTC m=+864.767157391"
Feb 19 08:59:24 crc kubenswrapper[4788]: I0219 08:59:24.527310 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-q6x4r"
Feb 19 08:59:24 crc kubenswrapper[4788]: I0219 08:59:24.632622 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-fjt75"
Feb 19 08:59:24 crc kubenswrapper[4788]: I0219 08:59:24.723799 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-ncsrp"
Feb 19 08:59:24 crc kubenswrapper[4788]: I0219 08:59:24.888414 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-g7mw7"
Feb 19 08:59:24 crc kubenswrapper[4788]: I0219 08:59:24.931152 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-6dvcl"
Feb 19 08:59:26 crc kubenswrapper[4788]: I0219 08:59:26.612819 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb"
Feb 19 08:59:26 crc kubenswrapper[4788]: I0219 08:59:26.619680 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d337efba-1c27-47ac-bdd7-17c3848678cb-webhook-certs\") pod \"openstack-operator-controller-manager-6774fbc4bc-5gddb\" (UID: \"d337efba-1c27-47ac-bdd7-17c3848678cb\") " pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb"
Feb 19 08:59:26 crc kubenswrapper[4788]: I0219 08:59:26.843684 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7hnp6"
Feb 19 08:59:26 crc kubenswrapper[4788]: I0219 08:59:26.852071 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb"
Feb 19 08:59:27 crc kubenswrapper[4788]: I0219 08:59:27.281758 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb"]
Feb 19 08:59:27 crc kubenswrapper[4788]: I0219 08:59:27.781858 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" event={"ID":"d337efba-1c27-47ac-bdd7-17c3848678cb","Type":"ContainerStarted","Data":"a3a962b85f037425447c032d91e1f6bb9a865716f7c182a0d0b501bd4e220ebe"}
Feb 19 08:59:30 crc kubenswrapper[4788]: I0219 08:59:30.064878 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-ccpr8"
Feb 19 08:59:30 crc kubenswrapper[4788]: I0219 08:59:30.246656 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg"
Feb 19 08:59:30 crc kubenswrapper[4788]: I0219 08:59:30.802579 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" event={"ID":"d337efba-1c27-47ac-bdd7-17c3848678cb","Type":"ContainerStarted","Data":"0eb69379790461057ae0c1fd420af9ae6ad5c9c4d2975af555ec376515983faa"}
Feb 19 08:59:30 crc kubenswrapper[4788]: I0219 08:59:30.802670 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb"
Feb 19 08:59:30 crc kubenswrapper[4788]: I0219 08:59:30.839201 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb" podStartSLOduration=36.83917631 podStartE2EDuration="36.83917631s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 08:59:30.831628932 +0000 UTC m=+872.819640404" watchObservedRunningTime="2026-02-19 08:59:30.83917631 +0000 UTC m=+872.827187782"
Feb 19 08:59:33 crc kubenswrapper[4788]: I0219 08:59:33.822484 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs" event={"ID":"cb9194bc-2a08-4c61-9302-14d3ab1b731a","Type":"ContainerStarted","Data":"7320aa9d1f6744fa2e6835e5f7f26f28847602b8f000e16a73e06ffd4aba50bb"}
Feb 19 08:59:33 crc kubenswrapper[4788]: I0219 08:59:33.823851 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs"
Feb 19 08:59:33 crc kubenswrapper[4788]: I0219 08:59:33.824122 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69" event={"ID":"226ea1a2-f858-46df-8a62-12f5c41da0c5","Type":"ContainerStarted","Data":"68855bbe6082a40a0a3772520ce92481dd9d42523988d0e6d2cc8d6637642640"}
Feb 19 08:59:33 crc kubenswrapper[4788]: I0219 08:59:33.824781 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69"
Feb 19 08:59:33 crc kubenswrapper[4788]: I0219 08:59:33.826459 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z" event={"ID":"c4a6e8c2-5708-45ec-8cd7-08d552abbe53","Type":"ContainerStarted","Data":"1deb5c99e0c74f1c5c17de84476df32c52b6bd908d9b65dd44e37e40ced441fd"}
Feb 19 08:59:33 crc kubenswrapper[4788]: I0219 08:59:33.826732 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z"
Feb 19 08:59:33 crc kubenswrapper[4788]: I0219 08:59:33.844609 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs" podStartSLOduration=2.70532532 podStartE2EDuration="40.844589863s" podCreationTimestamp="2026-02-19 08:58:53 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.390694061 +0000 UTC m=+837.378705533" lastFinishedPulling="2026-02-19 08:59:33.529958594 +0000 UTC m=+875.517970076" observedRunningTime="2026-02-19 08:59:33.840107987 +0000 UTC m=+875.828119469" watchObservedRunningTime="2026-02-19 08:59:33.844589863 +0000 UTC m=+875.832601335"
Feb 19 08:59:33 crc kubenswrapper[4788]: I0219 08:59:33.861403 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69" podStartSLOduration=1.854407901 podStartE2EDuration="39.8613739s" podCreationTimestamp="2026-02-19 08:58:54 +0000 UTC" firstStartedPulling="2026-02-19 08:58:55.522938704 +0000 UTC m=+837.510950176" lastFinishedPulling="2026-02-19 08:59:33.529904693 +0000 UTC m=+875.517916175" observedRunningTime="2026-02-19 08:59:33.855669735 +0000 UTC m=+875.843681197" watchObservedRunningTime="2026-02-19 08:59:33.8613739 +0000 UTC m=+875.849385372"
Feb 19 08:59:33 crc kubenswrapper[4788]: I0219 08:59:33.883946 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z" podStartSLOduration=2.3125968869999998 podStartE2EDuration="40.883914102s" podCreationTimestamp="2026-02-19 08:58:53 +0000 UTC" firstStartedPulling="2026-02-19 08:58:54.95952211 +0000 UTC m=+836.947533582" lastFinishedPulling="2026-02-19 08:59:33.530839325 +0000 UTC m=+875.518850797" observedRunningTime="2026-02-19 08:59:33.879046737 +0000 UTC m=+875.867058219" watchObservedRunningTime="2026-02-19 08:59:33.883914102 +0000 UTC m=+875.871925584"
Feb 19 08:59:36 crc kubenswrapper[4788]: I0219 08:59:36.860113 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6774fbc4bc-5gddb"
Feb 19 08:59:44 crc kubenswrapper[4788]: I0219 08:59:44.269574 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-8jh9z"
Feb 19 08:59:44 crc kubenswrapper[4788]: I0219 08:59:44.431636 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-6zwrs"
Feb 19 08:59:44 crc kubenswrapper[4788]: I0219 08:59:44.591795 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-9sr69"
Feb 19 08:59:52 crc kubenswrapper[4788]: I0219 08:59:52.139055 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 08:59:52 crc kubenswrapper[4788]: I0219 08:59:52.139501 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.171065 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"]
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.172653 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.175339 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.175858 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.190062 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"]
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.326814 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-secret-volume\") pod \"collect-profiles-29524860-bvlmd\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.327187 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2fjj\" (UniqueName: \"kubernetes.io/projected/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-kube-api-access-b2fjj\") pod \"collect-profiles-29524860-bvlmd\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.327375 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-config-volume\") pod \"collect-profiles-29524860-bvlmd\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.428080 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2fjj\" (UniqueName: \"kubernetes.io/projected/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-kube-api-access-b2fjj\") pod \"collect-profiles-29524860-bvlmd\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.428123 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-config-volume\") pod \"collect-profiles-29524860-bvlmd\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.428235 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-secret-volume\") pod \"collect-profiles-29524860-bvlmd\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.429664 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-config-volume\") pod \"collect-profiles-29524860-bvlmd\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.436069 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-secret-volume\") pod \"collect-profiles-29524860-bvlmd\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.444189 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2fjj\" (UniqueName: \"kubernetes.io/projected/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-kube-api-access-b2fjj\") pod \"collect-profiles-29524860-bvlmd\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.497484 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:00 crc kubenswrapper[4788]: I0219 09:00:00.978777 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"]
Feb 19 09:00:01 crc kubenswrapper[4788]: I0219 09:00:01.044372 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd" event={"ID":"0a9b0944-a77d-4a9c-9d2c-f423333e62a0","Type":"ContainerStarted","Data":"d96769d16db9a95e42d4783886d50cbd6046fd96c682aa8738e9aa31bbdd27e3"}
Feb 19 09:00:01 crc kubenswrapper[4788]: E0219 09:00:01.406029 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a9b0944_a77d_4a9c_9d2c_f423333e62a0.slice/crio-5395284b6654e0eac4bd84a33d45b6771543388a579b8ea1b996fab350a066ec.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.056658 4788 generic.go:334] "Generic (PLEG): container finished" podID="0a9b0944-a77d-4a9c-9d2c-f423333e62a0" containerID="5395284b6654e0eac4bd84a33d45b6771543388a579b8ea1b996fab350a066ec" exitCode=0
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.056733 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd" event={"ID":"0a9b0944-a77d-4a9c-9d2c-f423333e62a0","Type":"ContainerDied","Data":"5395284b6654e0eac4bd84a33d45b6771543388a579b8ea1b996fab350a066ec"}
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.491401 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q6sfh"]
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.493379 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.496772 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.496969 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.497031 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-9k5wh"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.497153 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.508899 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q6sfh"]
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.550460 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n64lz"]
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.551821 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.554550 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.558452 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n64lz"]
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.659115 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fck8j\" (UniqueName: \"kubernetes.io/projected/dc8c23c7-fd25-4811-8770-fda94392d3c3-kube-api-access-fck8j\") pod \"dnsmasq-dns-675f4bcbfc-q6sfh\" (UID: \"dc8c23c7-fd25-4811-8770-fda94392d3c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.659192 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8c23c7-fd25-4811-8770-fda94392d3c3-config\") pod \"dnsmasq-dns-675f4bcbfc-q6sfh\" (UID: \"dc8c23c7-fd25-4811-8770-fda94392d3c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.659325 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsvwl\" (UniqueName: \"kubernetes.io/projected/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-kube-api-access-wsvwl\") pod \"dnsmasq-dns-78dd6ddcc-n64lz\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.659434 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-n64lz\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.659494 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-config\") pod \"dnsmasq-dns-78dd6ddcc-n64lz\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.760788 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-n64lz\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.760920 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-config\") pod \"dnsmasq-dns-78dd6ddcc-n64lz\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.760995 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fck8j\" (UniqueName: \"kubernetes.io/projected/dc8c23c7-fd25-4811-8770-fda94392d3c3-kube-api-access-fck8j\") pod \"dnsmasq-dns-675f4bcbfc-q6sfh\" (UID: \"dc8c23c7-fd25-4811-8770-fda94392d3c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.761030 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8c23c7-fd25-4811-8770-fda94392d3c3-config\") pod \"dnsmasq-dns-675f4bcbfc-q6sfh\" (UID: \"dc8c23c7-fd25-4811-8770-fda94392d3c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.761056 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsvwl\" (UniqueName: \"kubernetes.io/projected/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-kube-api-access-wsvwl\") pod \"dnsmasq-dns-78dd6ddcc-n64lz\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.762028 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8c23c7-fd25-4811-8770-fda94392d3c3-config\") pod \"dnsmasq-dns-675f4bcbfc-q6sfh\" (UID: \"dc8c23c7-fd25-4811-8770-fda94392d3c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.762105 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-n64lz\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.762562 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-config\") pod \"dnsmasq-dns-78dd6ddcc-n64lz\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.779670 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fck8j\" (UniqueName: \"kubernetes.io/projected/dc8c23c7-fd25-4811-8770-fda94392d3c3-kube-api-access-fck8j\") pod \"dnsmasq-dns-675f4bcbfc-q6sfh\" (UID: \"dc8c23c7-fd25-4811-8770-fda94392d3c3\") " pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.779928 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsvwl\" (UniqueName: \"kubernetes.io/projected/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-kube-api-access-wsvwl\") pod \"dnsmasq-dns-78dd6ddcc-n64lz\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.817102 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh"
Feb 19 09:00:02 crc kubenswrapper[4788]: I0219 09:00:02.868651 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz"
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.078902 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q6sfh"]
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.290692 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.363118 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n64lz"]
Feb 19 09:00:03 crc kubenswrapper[4788]: W0219 09:00:03.364108 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5e0ea28_8223_41dc_91ea_1dfb85dc9fd1.slice/crio-58eb2fd17dcff03b5c1a4f64816fa3c29c8b426d7c2504760531fb43cb338c1e WatchSource:0}: Error finding container 58eb2fd17dcff03b5c1a4f64816fa3c29c8b426d7c2504760531fb43cb338c1e: Status 404 returned error can't find the container with id 58eb2fd17dcff03b5c1a4f64816fa3c29c8b426d7c2504760531fb43cb338c1e
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.477017 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-secret-volume\") pod \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") "
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.477092 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2fjj\" (UniqueName: \"kubernetes.io/projected/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-kube-api-access-b2fjj\") pod \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") "
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.477145 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-config-volume\") pod \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\" (UID: \"0a9b0944-a77d-4a9c-9d2c-f423333e62a0\") "
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.478123 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-config-volume" (OuterVolumeSpecName: "config-volume") pod "0a9b0944-a77d-4a9c-9d2c-f423333e62a0" (UID: "0a9b0944-a77d-4a9c-9d2c-f423333e62a0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.484181 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0a9b0944-a77d-4a9c-9d2c-f423333e62a0" (UID: "0a9b0944-a77d-4a9c-9d2c-f423333e62a0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.484461 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-kube-api-access-b2fjj" (OuterVolumeSpecName: "kube-api-access-b2fjj") pod "0a9b0944-a77d-4a9c-9d2c-f423333e62a0" (UID: "0a9b0944-a77d-4a9c-9d2c-f423333e62a0"). InnerVolumeSpecName "kube-api-access-b2fjj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.578973 4788 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.579042 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2fjj\" (UniqueName: \"kubernetes.io/projected/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-kube-api-access-b2fjj\") on node \"crc\" DevicePath \"\""
Feb 19 09:00:03 crc kubenswrapper[4788]: I0219 09:00:03.579064 4788 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a9b0944-a77d-4a9c-9d2c-f423333e62a0-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 09:00:04 crc kubenswrapper[4788]: I0219 09:00:04.088942 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"
Feb 19 09:00:04 crc kubenswrapper[4788]: I0219 09:00:04.088926 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd" event={"ID":"0a9b0944-a77d-4a9c-9d2c-f423333e62a0","Type":"ContainerDied","Data":"d96769d16db9a95e42d4783886d50cbd6046fd96c682aa8738e9aa31bbdd27e3"}
Feb 19 09:00:04 crc kubenswrapper[4788]: I0219 09:00:04.089296 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d96769d16db9a95e42d4783886d50cbd6046fd96c682aa8738e9aa31bbdd27e3"
Feb 19 09:00:04 crc kubenswrapper[4788]: I0219 09:00:04.091119 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz" event={"ID":"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1","Type":"ContainerStarted","Data":"58eb2fd17dcff03b5c1a4f64816fa3c29c8b426d7c2504760531fb43cb338c1e"}
Feb 19 09:00:04 crc kubenswrapper[4788]: I0219 09:00:04.093994 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh" event={"ID":"dc8c23c7-fd25-4811-8770-fda94392d3c3","Type":"ContainerStarted","Data":"80a12e059d66f2e2881265ffef88d98f21c011635250d53ae56044a4ba6fab6d"}
Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.088489 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q6sfh"]
Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.112191 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6nknz"]
Feb 19 09:00:05 crc kubenswrapper[4788]: E0219 09:00:05.112508 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9b0944-a77d-4a9c-9d2c-f423333e62a0" containerName="collect-profiles"
Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.112524 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9b0944-a77d-4a9c-9d2c-f423333e62a0"
containerName="collect-profiles" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.112696 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9b0944-a77d-4a9c-9d2c-f423333e62a0" containerName="collect-profiles" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.114082 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.124042 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6nknz"] Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.212001 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6nknz\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.212057 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh9kt\" (UniqueName: \"kubernetes.io/projected/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-kube-api-access-rh9kt\") pod \"dnsmasq-dns-666b6646f7-6nknz\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.212075 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-config\") pod \"dnsmasq-dns-666b6646f7-6nknz\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.314071 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6nknz\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.315078 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh9kt\" (UniqueName: \"kubernetes.io/projected/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-kube-api-access-rh9kt\") pod \"dnsmasq-dns-666b6646f7-6nknz\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.315098 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-config\") pod \"dnsmasq-dns-666b6646f7-6nknz\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.315019 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-dns-svc\") pod \"dnsmasq-dns-666b6646f7-6nknz\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.315740 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-config\") pod \"dnsmasq-dns-666b6646f7-6nknz\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.344107 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh9kt\" (UniqueName: \"kubernetes.io/projected/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-kube-api-access-rh9kt\") pod 
\"dnsmasq-dns-666b6646f7-6nknz\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.362615 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n64lz"] Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.387980 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8lk8v"] Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.390013 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.400125 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8lk8v"] Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.415791 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kgwn\" (UniqueName: \"kubernetes.io/projected/23ee4096-6b61-438e-a3e1-ba9e720abd80-kube-api-access-2kgwn\") pod \"dnsmasq-dns-57d769cc4f-8lk8v\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.415830 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8lk8v\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.415850 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-config\") pod \"dnsmasq-dns-57d769cc4f-8lk8v\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.440832 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.517118 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kgwn\" (UniqueName: \"kubernetes.io/projected/23ee4096-6b61-438e-a3e1-ba9e720abd80-kube-api-access-2kgwn\") pod \"dnsmasq-dns-57d769cc4f-8lk8v\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.517183 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8lk8v\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.517213 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-config\") pod \"dnsmasq-dns-57d769cc4f-8lk8v\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.519636 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-config\") pod \"dnsmasq-dns-57d769cc4f-8lk8v\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.519868 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-dns-svc\") pod 
\"dnsmasq-dns-57d769cc4f-8lk8v\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.535382 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kgwn\" (UniqueName: \"kubernetes.io/projected/23ee4096-6b61-438e-a3e1-ba9e720abd80-kube-api-access-2kgwn\") pod \"dnsmasq-dns-57d769cc4f-8lk8v\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.713156 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:05 crc kubenswrapper[4788]: I0219 09:00:05.904628 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6nknz"] Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.109946 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" event={"ID":"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f","Type":"ContainerStarted","Data":"e792a4b19efbcef82044bf6e7777c9c48cf8ba99550da7f3967f5f890813bfba"} Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.159194 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8lk8v"] Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.261173 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.262887 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.267929 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.268117 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.268189 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.268408 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vlc28" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.268624 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.268751 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.269175 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.286635 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.330602 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.330662 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.330688 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.330732 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad57631d-1772-49f0-ae6b-f16ee556e9c4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.330750 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.330774 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.330918 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/ad57631d-1772-49f0-ae6b-f16ee556e9c4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.331017 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.331051 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8lgs\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-kube-api-access-c8lgs\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.331082 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.331267 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.432500 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.432560 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad57631d-1772-49f0-ae6b-f16ee556e9c4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.432608 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.432637 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8lgs\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-kube-api-access-c8lgs\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.432668 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.432754 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.432789 
4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.432814 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.432839 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.432866 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad57631d-1772-49f0-ae6b-f16ee556e9c4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.432889 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.433091 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.433725 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.434554 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.434632 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.435145 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.436841 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " 
pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.438542 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad57631d-1772-49f0-ae6b-f16ee556e9c4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.440261 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad57631d-1772-49f0-ae6b-f16ee556e9c4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.440867 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.460256 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.465841 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.466685 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8lgs\" (UniqueName: 
\"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-kube-api-access-c8lgs\") pod \"rabbitmq-server-0\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") " pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.518967 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.520539 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.527868 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.528073 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.528154 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.528345 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.528540 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.528568 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.531865 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2bt5k" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.537293 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.594546 4788 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.634883 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bl7r\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-kube-api-access-6bl7r\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.634926 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.634949 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb0abe11-b278-4a3a-aeda-3e08a603924b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.634964 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.635001 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.635191 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.635320 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.635416 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.635554 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb0abe11-b278-4a3a-aeda-3e08a603924b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.635614 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.635641 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737040 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737097 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737143 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737184 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc 
kubenswrapper[4788]: I0219 09:00:06.737225 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb0abe11-b278-4a3a-aeda-3e08a603924b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737272 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737293 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737328 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bl7r\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-kube-api-access-6bl7r\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737358 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737386 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb0abe11-b278-4a3a-aeda-3e08a603924b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737406 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737776 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.737914 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.738026 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.738303 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.738991 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.740203 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.747528 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.753393 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb0abe11-b278-4a3a-aeda-3e08a603924b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.760294 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bl7r\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-kube-api-access-6bl7r\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.760557 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb0abe11-b278-4a3a-aeda-3e08a603924b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.760786 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.769387 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:06 crc kubenswrapper[4788]: I0219 09:00:06.849456 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:00:07 crc kubenswrapper[4788]: I0219 09:00:07.984161 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 19 09:00:07 crc kubenswrapper[4788]: I0219 09:00:07.985439 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 09:00:07 crc kubenswrapper[4788]: I0219 09:00:07.988614 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 19 09:00:07 crc kubenswrapper[4788]: I0219 09:00:07.990257 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 19 09:00:07 crc kubenswrapper[4788]: I0219 09:00:07.990433 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 19 09:00:07 crc kubenswrapper[4788]: I0219 09:00:07.990466 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-n65f4" Feb 19 09:00:07 crc kubenswrapper[4788]: I0219 09:00:07.994267 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 09:00:07 crc kubenswrapper[4788]: I0219 09:00:07.994460 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.160132 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.160183 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.160212 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.160276 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.160300 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-kolla-config\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.160348 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.160381 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvbn4\" (UniqueName: \"kubernetes.io/projected/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-kube-api-access-gvbn4\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.160476 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-config-data-default\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.261708 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.261765 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.261796 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.261839 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.261872 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-kolla-config\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " 
pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.261916 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.261950 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvbn4\" (UniqueName: \"kubernetes.io/projected/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-kube-api-access-gvbn4\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.261997 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-config-data-default\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.262021 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.263552 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-config-data-default\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.263657 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-kolla-config\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.263987 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.264061 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.267948 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.269812 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.285060 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvbn4\" (UniqueName: 
\"kubernetes.io/projected/e1e578ea-f70e-4aed-910b-4c1c0ddb3c39-kube-api-access-gvbn4\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.305558 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39\") " pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.316042 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 19 09:00:08 crc kubenswrapper[4788]: I0219 09:00:08.607850 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.145316 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" event={"ID":"23ee4096-6b61-438e-a3e1-ba9e720abd80","Type":"ContainerStarted","Data":"d5ebaa688224516945264c61a8788f393cf41095589e0732da83ddd01dd8142c"} Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.407915 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.410492 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.414639 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.414821 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-d2ct6" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.414860 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.415305 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.421886 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.581746 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/776314c0-8a5e-4224-8337-d2ae060a7ecd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.581800 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/776314c0-8a5e-4224-8337-d2ae060a7ecd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.581828 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-876dc\" (UniqueName: 
\"kubernetes.io/projected/776314c0-8a5e-4224-8337-d2ae060a7ecd-kube-api-access-876dc\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.582060 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.582202 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/776314c0-8a5e-4224-8337-d2ae060a7ecd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.582275 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/776314c0-8a5e-4224-8337-d2ae060a7ecd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.582561 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/776314c0-8a5e-4224-8337-d2ae060a7ecd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.582594 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/776314c0-8a5e-4224-8337-d2ae060a7ecd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.685820 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/776314c0-8a5e-4224-8337-d2ae060a7ecd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.685894 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/776314c0-8a5e-4224-8337-d2ae060a7ecd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.686008 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/776314c0-8a5e-4224-8337-d2ae060a7ecd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.686038 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/776314c0-8a5e-4224-8337-d2ae060a7ecd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.686666 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/776314c0-8a5e-4224-8337-d2ae060a7ecd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.686796 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/776314c0-8a5e-4224-8337-d2ae060a7ecd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.686065 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/776314c0-8a5e-4224-8337-d2ae060a7ecd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.686866 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/776314c0-8a5e-4224-8337-d2ae060a7ecd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.686882 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-876dc\" (UniqueName: \"kubernetes.io/projected/776314c0-8a5e-4224-8337-d2ae060a7ecd-kube-api-access-876dc\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.686882 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/776314c0-8a5e-4224-8337-d2ae060a7ecd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.686940 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.687189 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.688195 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/776314c0-8a5e-4224-8337-d2ae060a7ecd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.691014 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/776314c0-8a5e-4224-8337-d2ae060a7ecd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.691956 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/776314c0-8a5e-4224-8337-d2ae060a7ecd-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.711745 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-876dc\" (UniqueName: \"kubernetes.io/projected/776314c0-8a5e-4224-8337-d2ae060a7ecd-kube-api-access-876dc\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.720730 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"776314c0-8a5e-4224-8337-d2ae060a7ecd\") " pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.751143 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.751699 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.752580 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.755961 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.755976 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-n5grf" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.756098 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.765120 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.889866 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/74df8b3a-d0c6-4cb3-b514-a38198179c59-memcached-tls-certs\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.890137 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74df8b3a-d0c6-4cb3-b514-a38198179c59-kolla-config\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.890169 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74df8b3a-d0c6-4cb3-b514-a38198179c59-combined-ca-bundle\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.890192 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74df8b3a-d0c6-4cb3-b514-a38198179c59-config-data\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.890262 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hzb2\" (UniqueName: \"kubernetes.io/projected/74df8b3a-d0c6-4cb3-b514-a38198179c59-kube-api-access-2hzb2\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.954940 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vxzll"] Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.957515 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.964211 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxzll"] Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.991285 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74df8b3a-d0c6-4cb3-b514-a38198179c59-config-data\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.991348 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hzb2\" (UniqueName: \"kubernetes.io/projected/74df8b3a-d0c6-4cb3-b514-a38198179c59-kube-api-access-2hzb2\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.991410 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/74df8b3a-d0c6-4cb3-b514-a38198179c59-memcached-tls-certs\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.991443 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74df8b3a-d0c6-4cb3-b514-a38198179c59-kolla-config\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.991475 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74df8b3a-d0c6-4cb3-b514-a38198179c59-combined-ca-bundle\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.992347 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/74df8b3a-d0c6-4cb3-b514-a38198179c59-config-data\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.995359 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74df8b3a-d0c6-4cb3-b514-a38198179c59-combined-ca-bundle\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc kubenswrapper[4788]: I0219 09:00:09.995507 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/74df8b3a-d0c6-4cb3-b514-a38198179c59-kolla-config\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:09 crc 
kubenswrapper[4788]: I0219 09:00:09.995651 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/74df8b3a-d0c6-4cb3-b514-a38198179c59-memcached-tls-certs\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 09:00:10.013888 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hzb2\" (UniqueName: \"kubernetes.io/projected/74df8b3a-d0c6-4cb3-b514-a38198179c59-kube-api-access-2hzb2\") pod \"memcached-0\" (UID: \"74df8b3a-d0c6-4cb3-b514-a38198179c59\") " pod="openstack/memcached-0" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 09:00:10.093096 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg827\" (UniqueName: \"kubernetes.io/projected/e71e3f37-f071-4790-977c-915af714faf8-kube-api-access-gg827\") pod \"community-operators-vxzll\" (UID: \"e71e3f37-f071-4790-977c-915af714faf8\") " pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 09:00:10.093177 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-utilities\") pod \"community-operators-vxzll\" (UID: \"e71e3f37-f071-4790-977c-915af714faf8\") " pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 09:00:10.093258 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-catalog-content\") pod \"community-operators-vxzll\" (UID: \"e71e3f37-f071-4790-977c-915af714faf8\") " pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 
09:00:10.098596 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 09:00:10.194504 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-catalog-content\") pod \"community-operators-vxzll\" (UID: \"e71e3f37-f071-4790-977c-915af714faf8\") " pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 09:00:10.194560 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg827\" (UniqueName: \"kubernetes.io/projected/e71e3f37-f071-4790-977c-915af714faf8-kube-api-access-gg827\") pod \"community-operators-vxzll\" (UID: \"e71e3f37-f071-4790-977c-915af714faf8\") " pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 09:00:10.194612 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-utilities\") pod \"community-operators-vxzll\" (UID: \"e71e3f37-f071-4790-977c-915af714faf8\") " pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 09:00:10.195026 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-utilities\") pod \"community-operators-vxzll\" (UID: \"e71e3f37-f071-4790-977c-915af714faf8\") " pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 09:00:10.195226 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-catalog-content\") pod \"community-operators-vxzll\" (UID: 
\"e71e3f37-f071-4790-977c-915af714faf8\") " pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 09:00:10.211211 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg827\" (UniqueName: \"kubernetes.io/projected/e71e3f37-f071-4790-977c-915af714faf8-kube-api-access-gg827\") pod \"community-operators-vxzll\" (UID: \"e71e3f37-f071-4790-977c-915af714faf8\") " pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:10 crc kubenswrapper[4788]: I0219 09:00:10.276203 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:11 crc kubenswrapper[4788]: I0219 09:00:11.888285 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 09:00:11 crc kubenswrapper[4788]: I0219 09:00:11.889504 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 09:00:11 crc kubenswrapper[4788]: I0219 09:00:11.892809 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vbff6" Feb 19 09:00:11 crc kubenswrapper[4788]: I0219 09:00:11.900392 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 09:00:12 crc kubenswrapper[4788]: I0219 09:00:12.022629 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbtcw\" (UniqueName: \"kubernetes.io/projected/ff06e9e7-8b7d-42d9-b321-172ede793104-kube-api-access-gbtcw\") pod \"kube-state-metrics-0\" (UID: \"ff06e9e7-8b7d-42d9-b321-172ede793104\") " pod="openstack/kube-state-metrics-0" Feb 19 09:00:12 crc kubenswrapper[4788]: I0219 09:00:12.124659 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbtcw\" (UniqueName: 
\"kubernetes.io/projected/ff06e9e7-8b7d-42d9-b321-172ede793104-kube-api-access-gbtcw\") pod \"kube-state-metrics-0\" (UID: \"ff06e9e7-8b7d-42d9-b321-172ede793104\") " pod="openstack/kube-state-metrics-0" Feb 19 09:00:12 crc kubenswrapper[4788]: I0219 09:00:12.141191 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbtcw\" (UniqueName: \"kubernetes.io/projected/ff06e9e7-8b7d-42d9-b321-172ede793104-kube-api-access-gbtcw\") pod \"kube-state-metrics-0\" (UID: \"ff06e9e7-8b7d-42d9-b321-172ede793104\") " pod="openstack/kube-state-metrics-0" Feb 19 09:00:12 crc kubenswrapper[4788]: I0219 09:00:12.206081 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 09:00:14 crc kubenswrapper[4788]: I0219 09:00:14.775251 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.227152 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-msb2b"] Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.228666 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.237884 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-s2zt9" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.237922 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-msb2b"] Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.238117 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.238322 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.253349 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-snwhx"] Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.275371 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.296201 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-snwhx"] Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382058 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/be836fd0-7c7e-4824-b455-bb4ccec1163e-ovn-controller-tls-certs\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382133 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqkbl\" (UniqueName: \"kubernetes.io/projected/be836fd0-7c7e-4824-b455-bb4ccec1163e-kube-api-access-tqkbl\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382336 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be836fd0-7c7e-4824-b455-bb4ccec1163e-combined-ca-bundle\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382403 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-etc-ovs\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382470 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be836fd0-7c7e-4824-b455-bb4ccec1163e-var-log-ovn\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382593 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7e5355c-1a77-48de-998c-7d6e676d5eee-scripts\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382677 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-var-run\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382728 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-var-lib\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382769 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be836fd0-7c7e-4824-b455-bb4ccec1163e-scripts\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382846 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/be836fd0-7c7e-4824-b455-bb4ccec1163e-var-run-ovn\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382903 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jxdj\" (UniqueName: \"kubernetes.io/projected/f7e5355c-1a77-48de-998c-7d6e676d5eee-kube-api-access-4jxdj\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382932 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be836fd0-7c7e-4824-b455-bb4ccec1163e-var-run\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.382988 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-var-log\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.484227 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/be836fd0-7c7e-4824-b455-bb4ccec1163e-ovn-controller-tls-certs\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.484358 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqkbl\" (UniqueName: 
\"kubernetes.io/projected/be836fd0-7c7e-4824-b455-bb4ccec1163e-kube-api-access-tqkbl\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.484393 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be836fd0-7c7e-4824-b455-bb4ccec1163e-combined-ca-bundle\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.484419 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-etc-ovs\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.484448 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be836fd0-7c7e-4824-b455-bb4ccec1163e-var-log-ovn\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.484480 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7e5355c-1a77-48de-998c-7d6e676d5eee-scripts\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.484515 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-var-run\") pod \"ovn-controller-ovs-snwhx\" (UID: 
\"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.484542 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-var-lib\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.484570 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be836fd0-7c7e-4824-b455-bb4ccec1163e-scripts\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.485038 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-etc-ovs\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.485165 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-var-run\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.485385 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be836fd0-7c7e-4824-b455-bb4ccec1163e-var-run-ovn\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.487175 4788 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7e5355c-1a77-48de-998c-7d6e676d5eee-scripts\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.487944 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be836fd0-7c7e-4824-b455-bb4ccec1163e-scripts\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.488141 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/be836fd0-7c7e-4824-b455-bb4ccec1163e-var-log-ovn\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.488058 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-var-lib\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.484610 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/be836fd0-7c7e-4824-b455-bb4ccec1163e-var-run-ovn\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.488617 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jxdj\" (UniqueName: \"kubernetes.io/projected/f7e5355c-1a77-48de-998c-7d6e676d5eee-kube-api-access-4jxdj\") pod 
\"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.488749 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be836fd0-7c7e-4824-b455-bb4ccec1163e-var-run\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.488913 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-var-log\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.488880 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/be836fd0-7c7e-4824-b455-bb4ccec1163e-var-run\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.489106 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f7e5355c-1a77-48de-998c-7d6e676d5eee-var-log\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.491984 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/be836fd0-7c7e-4824-b455-bb4ccec1163e-ovn-controller-tls-certs\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 
09:00:15.502184 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be836fd0-7c7e-4824-b455-bb4ccec1163e-combined-ca-bundle\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.509748 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jxdj\" (UniqueName: \"kubernetes.io/projected/f7e5355c-1a77-48de-998c-7d6e676d5eee-kube-api-access-4jxdj\") pod \"ovn-controller-ovs-snwhx\" (UID: \"f7e5355c-1a77-48de-998c-7d6e676d5eee\") " pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.518411 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqkbl\" (UniqueName: \"kubernetes.io/projected/be836fd0-7c7e-4824-b455-bb4ccec1163e-kube-api-access-tqkbl\") pod \"ovn-controller-msb2b\" (UID: \"be836fd0-7c7e-4824-b455-bb4ccec1163e\") " pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.556782 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-msb2b" Feb 19 09:00:15 crc kubenswrapper[4788]: I0219 09:00:15.607862 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.118514 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.120066 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.122869 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.123313 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.123684 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wg5tg" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.131626 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.131717 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.149828 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.199940 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.200014 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8a6d4e8-cfb3-4949-9901-ca31478fc108-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.200042 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a6d4e8-cfb3-4949-9901-ca31478fc108-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.200059 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8a6d4e8-cfb3-4949-9901-ca31478fc108-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.200113 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8a6d4e8-cfb3-4949-9901-ca31478fc108-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.200138 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b8g4\" (UniqueName: \"kubernetes.io/projected/e8a6d4e8-cfb3-4949-9901-ca31478fc108-kube-api-access-4b8g4\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.200208 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8a6d4e8-cfb3-4949-9901-ca31478fc108-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.200270 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e8a6d4e8-cfb3-4949-9901-ca31478fc108-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.302045 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8a6d4e8-cfb3-4949-9901-ca31478fc108-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.302095 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b8g4\" (UniqueName: \"kubernetes.io/projected/e8a6d4e8-cfb3-4949-9901-ca31478fc108-kube-api-access-4b8g4\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.302146 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8a6d4e8-cfb3-4949-9901-ca31478fc108-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.302175 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a6d4e8-cfb3-4949-9901-ca31478fc108-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.302217 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 
09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.302239 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8a6d4e8-cfb3-4949-9901-ca31478fc108-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.302278 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a6d4e8-cfb3-4949-9901-ca31478fc108-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.302295 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8a6d4e8-cfb3-4949-9901-ca31478fc108-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.302925 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8a6d4e8-cfb3-4949-9901-ca31478fc108-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.302959 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.303549 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e8a6d4e8-cfb3-4949-9901-ca31478fc108-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.303572 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8a6d4e8-cfb3-4949-9901-ca31478fc108-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.305746 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8a6d4e8-cfb3-4949-9901-ca31478fc108-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.311411 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8a6d4e8-cfb3-4949-9901-ca31478fc108-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.312367 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8a6d4e8-cfb3-4949-9901-ca31478fc108-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.318980 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b8g4\" (UniqueName: \"kubernetes.io/projected/e8a6d4e8-cfb3-4949-9901-ca31478fc108-kube-api-access-4b8g4\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " 
pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.331180 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8a6d4e8-cfb3-4949-9901-ca31478fc108\") " pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:16 crc kubenswrapper[4788]: I0219 09:00:16.457486 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.396153 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.399624 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.436409 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.436998 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.437124 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-f4mb5" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.441488 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.438553 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.570622 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gsp8\" (UniqueName: 
\"kubernetes.io/projected/ad13ca9c-744c-4c12-9911-7e84100a1bda-kube-api-access-9gsp8\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.570683 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad13ca9c-744c-4c12-9911-7e84100a1bda-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.570835 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad13ca9c-744c-4c12-9911-7e84100a1bda-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.570892 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad13ca9c-744c-4c12-9911-7e84100a1bda-config\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.571072 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad13ca9c-744c-4c12-9911-7e84100a1bda-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.571167 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ad13ca9c-744c-4c12-9911-7e84100a1bda-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.571341 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.571562 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ad13ca9c-744c-4c12-9911-7e84100a1bda-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.673681 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad13ca9c-744c-4c12-9911-7e84100a1bda-config\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.673788 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad13ca9c-744c-4c12-9911-7e84100a1bda-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.673861 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad13ca9c-744c-4c12-9911-7e84100a1bda-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.673925 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.674005 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ad13ca9c-744c-4c12-9911-7e84100a1bda-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.674163 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gsp8\" (UniqueName: \"kubernetes.io/projected/ad13ca9c-744c-4c12-9911-7e84100a1bda-kube-api-access-9gsp8\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.674219 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad13ca9c-744c-4c12-9911-7e84100a1bda-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.674388 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad13ca9c-744c-4c12-9911-7e84100a1bda-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.674645 4788 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ad13ca9c-744c-4c12-9911-7e84100a1bda-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.674665 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.675306 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad13ca9c-744c-4c12-9911-7e84100a1bda-config\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.675446 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ad13ca9c-744c-4c12-9911-7e84100a1bda-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.682728 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad13ca9c-744c-4c12-9911-7e84100a1bda-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.683399 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad13ca9c-744c-4c12-9911-7e84100a1bda-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.684745 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad13ca9c-744c-4c12-9911-7e84100a1bda-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.705436 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gsp8\" (UniqueName: \"kubernetes.io/projected/ad13ca9c-744c-4c12-9911-7e84100a1bda-kube-api-access-9gsp8\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.709297 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ad13ca9c-744c-4c12-9911-7e84100a1bda\") " pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: I0219 09:00:19.757578 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:19 crc kubenswrapper[4788]: W0219 09:00:19.893327 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e578ea_f70e_4aed_910b_4c1c0ddb3c39.slice/crio-0b1fe95f0a3d8106f9024985fdc7e32e1e55322c556f314ad1fc5d085bf17531 WatchSource:0}: Error finding container 0b1fe95f0a3d8106f9024985fdc7e32e1e55322c556f314ad1fc5d085bf17531: Status 404 returned error can't find the container with id 0b1fe95f0a3d8106f9024985fdc7e32e1e55322c556f314ad1fc5d085bf17531 Feb 19 09:00:20 crc kubenswrapper[4788]: I0219 09:00:20.223959 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39","Type":"ContainerStarted","Data":"0b1fe95f0a3d8106f9024985fdc7e32e1e55322c556f314ad1fc5d085bf17531"} Feb 19 09:00:20 crc kubenswrapper[4788]: I0219 09:00:20.301366 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 09:00:20 crc kubenswrapper[4788]: E0219 09:00:20.819874 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 19 09:00:20 crc kubenswrapper[4788]: E0219 09:00:20.820599 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wsvwl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-n64lz_openstack(b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 09:00:20 crc kubenswrapper[4788]: E0219 09:00:20.822350 4788 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz" podUID="b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1" Feb 19 09:00:20 crc kubenswrapper[4788]: E0219 09:00:20.825683 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 19 09:00:20 crc kubenswrapper[4788]: E0219 09:00:20.825906 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fck8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-q6sfh_openstack(dc8c23c7-fd25-4811-8770-fda94392d3c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 09:00:20 crc kubenswrapper[4788]: E0219 09:00:20.827126 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh" podUID="dc8c23c7-fd25-4811-8770-fda94392d3c3" Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.227119 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.232969 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad57631d-1772-49f0-ae6b-f16ee556e9c4","Type":"ContainerStarted","Data":"4d41512668aac09f7d3b537846306904627738615997a0bb5873d6e1afd45ed9"} Feb 19 09:00:21 crc kubenswrapper[4788]: W0219 09:00:21.303173 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb0abe11_b278_4a3a_aeda_3e08a603924b.slice/crio-6c0b6e48330d9271c831ac45b9ffe57b73b71e8d2ea969cf1a926d8aa4075997 WatchSource:0}: Error finding container 6c0b6e48330d9271c831ac45b9ffe57b73b71e8d2ea969cf1a926d8aa4075997: Status 404 returned error can't find the container with id 6c0b6e48330d9271c831ac45b9ffe57b73b71e8d2ea969cf1a926d8aa4075997 Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.381764 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 19 09:00:21 crc kubenswrapper[4788]: W0219 09:00:21.389469 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8a6d4e8_cfb3_4949_9901_ca31478fc108.slice/crio-dc08e09b6ee2fc04b3abda0f6cf428f21ae99820b8a438a29513a626be258398 WatchSource:0}: Error finding container dc08e09b6ee2fc04b3abda0f6cf428f21ae99820b8a438a29513a626be258398: Status 404 returned error can't find the container with id dc08e09b6ee2fc04b3abda0f6cf428f21ae99820b8a438a29513a626be258398 Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.754206 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz" Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.755240 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh" Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.755363 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.765154 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.795271 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.822274 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxzll"] Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.886793 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.937695 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsvwl\" (UniqueName: \"kubernetes.io/projected/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-kube-api-access-wsvwl\") pod \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.937821 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-dns-svc\") pod \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.937844 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fck8j\" (UniqueName: 
\"kubernetes.io/projected/dc8c23c7-fd25-4811-8770-fda94392d3c3-kube-api-access-fck8j\") pod \"dc8c23c7-fd25-4811-8770-fda94392d3c3\" (UID: \"dc8c23c7-fd25-4811-8770-fda94392d3c3\") " Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.937861 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8c23c7-fd25-4811-8770-fda94392d3c3-config\") pod \"dc8c23c7-fd25-4811-8770-fda94392d3c3\" (UID: \"dc8c23c7-fd25-4811-8770-fda94392d3c3\") " Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.937939 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-config\") pod \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\" (UID: \"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1\") " Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.942727 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1" (UID: "b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.943838 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-config" (OuterVolumeSpecName: "config") pod "b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1" (UID: "b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.944377 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc8c23c7-fd25-4811-8770-fda94392d3c3-config" (OuterVolumeSpecName: "config") pod "dc8c23c7-fd25-4811-8770-fda94392d3c3" (UID: "dc8c23c7-fd25-4811-8770-fda94392d3c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.967670 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-kube-api-access-wsvwl" (OuterVolumeSpecName: "kube-api-access-wsvwl") pod "b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1" (UID: "b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1"). InnerVolumeSpecName "kube-api-access-wsvwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:21 crc kubenswrapper[4788]: I0219 09:00:21.974200 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc8c23c7-fd25-4811-8770-fda94392d3c3-kube-api-access-fck8j" (OuterVolumeSpecName: "kube-api-access-fck8j") pod "dc8c23c7-fd25-4811-8770-fda94392d3c3" (UID: "dc8c23c7-fd25-4811-8770-fda94392d3c3"). InnerVolumeSpecName "kube-api-access-fck8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.042010 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsvwl\" (UniqueName: \"kubernetes.io/projected/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-kube-api-access-wsvwl\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.042065 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.042075 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fck8j\" (UniqueName: \"kubernetes.io/projected/dc8c23c7-fd25-4811-8770-fda94392d3c3-kube-api-access-fck8j\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.042085 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc8c23c7-fd25-4811-8770-fda94392d3c3-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.042093 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.047186 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-msb2b"] Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.132205 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-snwhx"] Feb 19 09:00:22 crc kubenswrapper[4788]: W0219 09:00:22.136479 4788 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7e5355c_1a77_48de_998c_7d6e676d5eee.slice/crio-f08d436776554c1d251394544f603cde10cee182faf4192372d53a4a3d7cce48 WatchSource:0}: Error finding container f08d436776554c1d251394544f603cde10cee182faf4192372d53a4a3d7cce48: Status 404 returned error can't find the container with id f08d436776554c1d251394544f603cde10cee182faf4192372d53a4a3d7cce48 Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.139024 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.139084 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.242736 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-snwhx" event={"ID":"f7e5355c-1a77-48de-998c-7d6e676d5eee","Type":"ContainerStarted","Data":"f08d436776554c1d251394544f603cde10cee182faf4192372d53a4a3d7cce48"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.245585 4788 generic.go:334] "Generic (PLEG): container finished" podID="23ee4096-6b61-438e-a3e1-ba9e720abd80" containerID="0c752a8f59745466ad69d0b62dd9fdeccd40a7c9a3aba3ed99f6838c5c4154a5" exitCode=0 Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.245820 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" 
event={"ID":"23ee4096-6b61-438e-a3e1-ba9e720abd80","Type":"ContainerDied","Data":"0c752a8f59745466ad69d0b62dd9fdeccd40a7c9a3aba3ed99f6838c5c4154a5"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.247193 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ff06e9e7-8b7d-42d9-b321-172ede793104","Type":"ContainerStarted","Data":"82976de3db0829358404db5db709f36d828027a92aab9593161896223fd3a387"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.248730 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-msb2b" event={"ID":"be836fd0-7c7e-4824-b455-bb4ccec1163e","Type":"ContainerStarted","Data":"0945c6c2291818f4291d4f9d438981e62a3ffc528095b9ef64e66884cef8f9f3"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.249674 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz" event={"ID":"b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1","Type":"ContainerDied","Data":"58eb2fd17dcff03b5c1a4f64816fa3c29c8b426d7c2504760531fb43cb338c1e"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.249685 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-n64lz" Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.250371 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"74df8b3a-d0c6-4cb3-b514-a38198179c59","Type":"ContainerStarted","Data":"25b7e7712bdce78f3e029d85665bf3b3e88b1e5a876ff99924e080e6575281dd"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.253126 4788 generic.go:334] "Generic (PLEG): container finished" podID="6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" containerID="f868ad2689db76bed66b08ca46a02097d30d0fa2481678927a9187b8af7d7fb8" exitCode=0 Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.253185 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" event={"ID":"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f","Type":"ContainerDied","Data":"f868ad2689db76bed66b08ca46a02097d30d0fa2481678927a9187b8af7d7fb8"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.259894 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"776314c0-8a5e-4224-8337-d2ae060a7ecd","Type":"ContainerStarted","Data":"9cf7edcad90915869c2b6bdfcdc2f70836fdb93cf0ffa0cae7d781e9a1398596"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.267147 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ad13ca9c-744c-4c12-9911-7e84100a1bda","Type":"ContainerStarted","Data":"43dc502eeb299cf1b7303a30b004017ab5d1c007d94f98e1feda75886ab28d2d"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.271271 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh" event={"ID":"dc8c23c7-fd25-4811-8770-fda94392d3c3","Type":"ContainerDied","Data":"80a12e059d66f2e2881265ffef88d98f21c011635250d53ae56044a4ba6fab6d"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.271370 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-q6sfh" Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.273789 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8a6d4e8-cfb3-4949-9901-ca31478fc108","Type":"ContainerStarted","Data":"dc08e09b6ee2fc04b3abda0f6cf428f21ae99820b8a438a29513a626be258398"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.276916 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bb0abe11-b278-4a3a-aeda-3e08a603924b","Type":"ContainerStarted","Data":"6c0b6e48330d9271c831ac45b9ffe57b73b71e8d2ea969cf1a926d8aa4075997"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.279183 4788 generic.go:334] "Generic (PLEG): container finished" podID="e71e3f37-f071-4790-977c-915af714faf8" containerID="14139638b12f921d11e81828b9b1bb8df1ec2c0a469bbb53a52a7be82ec67dd4" exitCode=0 Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.279239 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxzll" event={"ID":"e71e3f37-f071-4790-977c-915af714faf8","Type":"ContainerDied","Data":"14139638b12f921d11e81828b9b1bb8df1ec2c0a469bbb53a52a7be82ec67dd4"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.279379 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxzll" event={"ID":"e71e3f37-f071-4790-977c-915af714faf8","Type":"ContainerStarted","Data":"cb7f4d07d81b9fb8eec72fa8b75130880feae6b18d4a831114789881d0aef69b"} Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.354993 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q6sfh"] Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.361519 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-q6sfh"] Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.372182 4788 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n64lz"] Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.376939 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-n64lz"] Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.727539 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1" path="/var/lib/kubelet/pods/b5e0ea28-8223-41dc-91ea-1dfb85dc9fd1/volumes" Feb 19 09:00:22 crc kubenswrapper[4788]: I0219 09:00:22.727998 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc8c23c7-fd25-4811-8770-fda94392d3c3" path="/var/lib/kubelet/pods/dc8c23c7-fd25-4811-8770-fda94392d3c3/volumes" Feb 19 09:00:23 crc kubenswrapper[4788]: I0219 09:00:23.290295 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" event={"ID":"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f","Type":"ContainerStarted","Data":"bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227"} Feb 19 09:00:23 crc kubenswrapper[4788]: I0219 09:00:23.290510 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:23 crc kubenswrapper[4788]: I0219 09:00:23.293478 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" event={"ID":"23ee4096-6b61-438e-a3e1-ba9e720abd80","Type":"ContainerStarted","Data":"143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d"} Feb 19 09:00:23 crc kubenswrapper[4788]: I0219 09:00:23.293609 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:23 crc kubenswrapper[4788]: I0219 09:00:23.306065 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" podStartSLOduration=3.226609677 podStartE2EDuration="18.306048535s" 
podCreationTimestamp="2026-02-19 09:00:05 +0000 UTC" firstStartedPulling="2026-02-19 09:00:05.920756466 +0000 UTC m=+907.908767928" lastFinishedPulling="2026-02-19 09:00:21.000195304 +0000 UTC m=+922.988206786" observedRunningTime="2026-02-19 09:00:23.304104006 +0000 UTC m=+925.292115478" watchObservedRunningTime="2026-02-19 09:00:23.306048535 +0000 UTC m=+925.294060007" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.446417 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" podStartSLOduration=8.047317793 podStartE2EDuration="20.446397828s" podCreationTimestamp="2026-02-19 09:00:05 +0000 UTC" firstStartedPulling="2026-02-19 09:00:08.607599953 +0000 UTC m=+910.595611425" lastFinishedPulling="2026-02-19 09:00:21.006679988 +0000 UTC m=+922.994691460" observedRunningTime="2026-02-19 09:00:23.335540592 +0000 UTC m=+925.323552064" watchObservedRunningTime="2026-02-19 09:00:25.446397828 +0000 UTC m=+927.434409300" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.454967 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9l9sr"] Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.456500 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.470976 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9l9sr"] Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.605660 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9z6c\" (UniqueName: \"kubernetes.io/projected/e2f5fb48-2371-47b0-be72-9521d37955bc-kube-api-access-j9z6c\") pod \"certified-operators-9l9sr\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.605757 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-catalog-content\") pod \"certified-operators-9l9sr\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.605863 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-utilities\") pod \"certified-operators-9l9sr\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.706887 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-utilities\") pod \"certified-operators-9l9sr\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.706987 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j9z6c\" (UniqueName: \"kubernetes.io/projected/e2f5fb48-2371-47b0-be72-9521d37955bc-kube-api-access-j9z6c\") pod \"certified-operators-9l9sr\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.707008 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-catalog-content\") pod \"certified-operators-9l9sr\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.707516 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-catalog-content\") pod \"certified-operators-9l9sr\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.707732 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-utilities\") pod \"certified-operators-9l9sr\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.743575 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9z6c\" (UniqueName: \"kubernetes.io/projected/e2f5fb48-2371-47b0-be72-9521d37955bc-kube-api-access-j9z6c\") pod \"certified-operators-9l9sr\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:25 crc kubenswrapper[4788]: I0219 09:00:25.779712 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:29 crc kubenswrapper[4788]: I0219 09:00:29.851128 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9l9sr"] Feb 19 09:00:30 crc kubenswrapper[4788]: I0219 09:00:30.389571 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9sr" event={"ID":"e2f5fb48-2371-47b0-be72-9521d37955bc","Type":"ContainerStarted","Data":"c438969f7705c765c34b80811523a8a21910a2b7907d59bfd1eca688d6e23682"} Feb 19 09:00:30 crc kubenswrapper[4788]: I0219 09:00:30.442408 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:30 crc kubenswrapper[4788]: I0219 09:00:30.731808 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:30 crc kubenswrapper[4788]: I0219 09:00:30.839691 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6nknz"] Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.399495 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ad13ca9c-744c-4c12-9911-7e84100a1bda","Type":"ContainerStarted","Data":"faaec36def72b447d48410211c9ee2495011a483f3290fa32920d3f1e34de592"} Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.401102 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39","Type":"ContainerStarted","Data":"5cbdf2635d6b21077f5f1671f0d756c8c5e89eca484d70668f1ae081b928c103"} Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.402398 4788 generic.go:334] "Generic (PLEG): container finished" podID="e2f5fb48-2371-47b0-be72-9521d37955bc" containerID="a3f9eb95bd8f928b95b5e34cd98b9b17c41dc3419a949d2048889d57f5bc7fc8" exitCode=0 Feb 19 09:00:31 crc 
kubenswrapper[4788]: I0219 09:00:31.402468 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9sr" event={"ID":"e2f5fb48-2371-47b0-be72-9521d37955bc","Type":"ContainerDied","Data":"a3f9eb95bd8f928b95b5e34cd98b9b17c41dc3419a949d2048889d57f5bc7fc8"} Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.403959 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-snwhx" event={"ID":"f7e5355c-1a77-48de-998c-7d6e676d5eee","Type":"ContainerStarted","Data":"6bcd7dbe2715f99b08a7b665d6b79a963618d3a8c00d8add5fc5fec72bb421ec"} Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.406006 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"776314c0-8a5e-4224-8337-d2ae060a7ecd","Type":"ContainerStarted","Data":"d2545f60ae3051223beb687bb4ba157034390630fbce6a270d926ca41bffece3"} Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.407849 4788 generic.go:334] "Generic (PLEG): container finished" podID="e71e3f37-f071-4790-977c-915af714faf8" containerID="ee954c1748690c0ec9ad95d4b7cad546a42b42dc0cebfd7025df44eeb6acfc0f" exitCode=0 Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.407886 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxzll" event={"ID":"e71e3f37-f071-4790-977c-915af714faf8","Type":"ContainerDied","Data":"ee954c1748690c0ec9ad95d4b7cad546a42b42dc0cebfd7025df44eeb6acfc0f"} Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.409504 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ff06e9e7-8b7d-42d9-b321-172ede793104","Type":"ContainerStarted","Data":"553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289"} Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.409713 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 09:00:31 crc 
kubenswrapper[4788]: I0219 09:00:31.410946 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8a6d4e8-cfb3-4949-9901-ca31478fc108","Type":"ContainerStarted","Data":"453f084b2ff7b979557f9f7082266a7af273d255a2e8568b9e02d4ef3c5d9958"} Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.413023 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"74df8b3a-d0c6-4cb3-b514-a38198179c59","Type":"ContainerStarted","Data":"e00bab268abe135f14fef6c3ab5b58609b07603e5836aba1b08b230d9f73e11b"} Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.414238 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.415624 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad57631d-1772-49f0-ae6b-f16ee556e9c4","Type":"ContainerStarted","Data":"deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36"} Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.417325 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" podUID="6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" containerName="dnsmasq-dns" containerID="cri-o://bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227" gracePeriod=10 Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.417974 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-msb2b" event={"ID":"be836fd0-7c7e-4824-b455-bb4ccec1163e","Type":"ContainerStarted","Data":"5ddd99b78db93002f29194edb1cf859be2e637d7d1e7cb4a8026751fa5b6b4a3"} Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.418004 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-msb2b" Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.448907 4788 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ovn-controller-msb2b" podStartSLOduration=8.979414942 podStartE2EDuration="16.448882867s" podCreationTimestamp="2026-02-19 09:00:15 +0000 UTC" firstStartedPulling="2026-02-19 09:00:22.063084521 +0000 UTC m=+924.051095993" lastFinishedPulling="2026-02-19 09:00:29.532552406 +0000 UTC m=+931.520563918" observedRunningTime="2026-02-19 09:00:31.442927006 +0000 UTC m=+933.430938498" watchObservedRunningTime="2026-02-19 09:00:31.448882867 +0000 UTC m=+933.436894339" Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.496148 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.671727647 podStartE2EDuration="22.496127453s" podCreationTimestamp="2026-02-19 09:00:09 +0000 UTC" firstStartedPulling="2026-02-19 09:00:21.794898932 +0000 UTC m=+923.782910404" lastFinishedPulling="2026-02-19 09:00:28.619298728 +0000 UTC m=+930.607310210" observedRunningTime="2026-02-19 09:00:31.485967655 +0000 UTC m=+933.473979137" watchObservedRunningTime="2026-02-19 09:00:31.496127453 +0000 UTC m=+933.484138925" Feb 19 09:00:31 crc kubenswrapper[4788]: I0219 09:00:31.523739 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.812183964 podStartE2EDuration="20.523719801s" podCreationTimestamp="2026-02-19 09:00:11 +0000 UTC" firstStartedPulling="2026-02-19 09:00:21.874406515 +0000 UTC m=+923.862417987" lastFinishedPulling="2026-02-19 09:00:30.585942342 +0000 UTC m=+932.573953824" observedRunningTime="2026-02-19 09:00:31.520742696 +0000 UTC m=+933.508754168" watchObservedRunningTime="2026-02-19 09:00:31.523719801 +0000 UTC m=+933.511731283" Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.002156 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.118038 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-dns-svc\") pod \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.118604 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh9kt\" (UniqueName: \"kubernetes.io/projected/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-kube-api-access-rh9kt\") pod \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.118689 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-config\") pod \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\" (UID: \"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f\") " Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.134758 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-kube-api-access-rh9kt" (OuterVolumeSpecName: "kube-api-access-rh9kt") pod "6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" (UID: "6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f"). InnerVolumeSpecName "kube-api-access-rh9kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.178876 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" (UID: "6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.186288 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-config" (OuterVolumeSpecName: "config") pod "6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" (UID: "6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.220853 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh9kt\" (UniqueName: \"kubernetes.io/projected/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-kube-api-access-rh9kt\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.220886 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.220897 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.425772 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bb0abe11-b278-4a3a-aeda-3e08a603924b","Type":"ContainerStarted","Data":"60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5"} Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.429878 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9sr" event={"ID":"e2f5fb48-2371-47b0-be72-9521d37955bc","Type":"ContainerStarted","Data":"3c7bf5b956189bc39d2dc49af684ca8e7b1d9f8526c2fcf96f94799c563cb4c9"} Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.433047 4788 generic.go:334] "Generic (PLEG): 
container finished" podID="6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" containerID="bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227" exitCode=0 Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.433145 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" event={"ID":"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f","Type":"ContainerDied","Data":"bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227"} Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.433179 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" event={"ID":"6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f","Type":"ContainerDied","Data":"e792a4b19efbcef82044bf6e7777c9c48cf8ba99550da7f3967f5f890813bfba"} Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.433173 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-6nknz" Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.433196 4788 scope.go:117] "RemoveContainer" containerID="bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227" Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.435236 4788 generic.go:334] "Generic (PLEG): container finished" podID="f7e5355c-1a77-48de-998c-7d6e676d5eee" containerID="6bcd7dbe2715f99b08a7b665d6b79a963618d3a8c00d8add5fc5fec72bb421ec" exitCode=0 Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.435511 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-snwhx" event={"ID":"f7e5355c-1a77-48de-998c-7d6e676d5eee","Type":"ContainerDied","Data":"6bcd7dbe2715f99b08a7b665d6b79a963618d3a8c00d8add5fc5fec72bb421ec"} Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.443914 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxzll" 
event={"ID":"e71e3f37-f071-4790-977c-915af714faf8","Type":"ContainerStarted","Data":"d5b736388630ef01dd35f5ee64b984608de37ff930e6cc9a8b72106a587ce63b"} Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.520666 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vxzll" podStartSLOduration=13.929898864 podStartE2EDuration="23.520651968s" podCreationTimestamp="2026-02-19 09:00:09 +0000 UTC" firstStartedPulling="2026-02-19 09:00:22.304948574 +0000 UTC m=+924.292960046" lastFinishedPulling="2026-02-19 09:00:31.895701678 +0000 UTC m=+933.883713150" observedRunningTime="2026-02-19 09:00:32.519516659 +0000 UTC m=+934.507528141" watchObservedRunningTime="2026-02-19 09:00:32.520651968 +0000 UTC m=+934.508663440" Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.535729 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6nknz"] Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.546165 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-6nknz"] Feb 19 09:00:32 crc kubenswrapper[4788]: I0219 09:00:32.723856 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" path="/var/lib/kubelet/pods/6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f/volumes" Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.058496 4788 scope.go:117] "RemoveContainer" containerID="f868ad2689db76bed66b08ca46a02097d30d0fa2481678927a9187b8af7d7fb8" Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.078440 4788 scope.go:117] "RemoveContainer" containerID="bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227" Feb 19 09:00:33 crc kubenswrapper[4788]: E0219 09:00:33.078829 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227\": container with ID 
starting with bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227 not found: ID does not exist" containerID="bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227" Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.078851 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227"} err="failed to get container status \"bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227\": rpc error: code = NotFound desc = could not find container \"bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227\": container with ID starting with bf0c9c8bc296eda3790ad1a02fb13151825b0dfd7b4a79f0698f061cc34ce227 not found: ID does not exist" Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.078870 4788 scope.go:117] "RemoveContainer" containerID="f868ad2689db76bed66b08ca46a02097d30d0fa2481678927a9187b8af7d7fb8" Feb 19 09:00:33 crc kubenswrapper[4788]: E0219 09:00:33.079453 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f868ad2689db76bed66b08ca46a02097d30d0fa2481678927a9187b8af7d7fb8\": container with ID starting with f868ad2689db76bed66b08ca46a02097d30d0fa2481678927a9187b8af7d7fb8 not found: ID does not exist" containerID="f868ad2689db76bed66b08ca46a02097d30d0fa2481678927a9187b8af7d7fb8" Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.079472 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f868ad2689db76bed66b08ca46a02097d30d0fa2481678927a9187b8af7d7fb8"} err="failed to get container status \"f868ad2689db76bed66b08ca46a02097d30d0fa2481678927a9187b8af7d7fb8\": rpc error: code = NotFound desc = could not find container \"f868ad2689db76bed66b08ca46a02097d30d0fa2481678927a9187b8af7d7fb8\": container with ID starting with f868ad2689db76bed66b08ca46a02097d30d0fa2481678927a9187b8af7d7fb8 not found: 
ID does not exist" Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.451717 4788 generic.go:334] "Generic (PLEG): container finished" podID="e2f5fb48-2371-47b0-be72-9521d37955bc" containerID="3c7bf5b956189bc39d2dc49af684ca8e7b1d9f8526c2fcf96f94799c563cb4c9" exitCode=0 Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.451758 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9sr" event={"ID":"e2f5fb48-2371-47b0-be72-9521d37955bc","Type":"ContainerDied","Data":"3c7bf5b956189bc39d2dc49af684ca8e7b1d9f8526c2fcf96f94799c563cb4c9"} Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.455997 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-snwhx" event={"ID":"f7e5355c-1a77-48de-998c-7d6e676d5eee","Type":"ContainerStarted","Data":"07fb02a89fe86f678bcef126b29a2b777ea7e88f077905f2d04c1f58e0d93d00"} Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.458019 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ad13ca9c-744c-4c12-9911-7e84100a1bda","Type":"ContainerStarted","Data":"782db906aa774ffe110742974275fbe46cdda2ff49f3339ad5cd84d4fdb393b7"} Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.462776 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8a6d4e8-cfb3-4949-9901-ca31478fc108","Type":"ContainerStarted","Data":"aa3368ff8cf07adc2d55369a1c8c7d128db9ad17f5238a3d496404d633fb3143"} Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.489985 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.745807109 podStartE2EDuration="18.489966366s" podCreationTimestamp="2026-02-19 09:00:15 +0000 UTC" firstStartedPulling="2026-02-19 09:00:21.3919102 +0000 UTC m=+923.379921672" lastFinishedPulling="2026-02-19 09:00:33.136069457 +0000 UTC m=+935.124080929" observedRunningTime="2026-02-19 
09:00:33.484458826 +0000 UTC m=+935.472470298" watchObservedRunningTime="2026-02-19 09:00:33.489966366 +0000 UTC m=+935.477977838" Feb 19 09:00:33 crc kubenswrapper[4788]: I0219 09:00:33.517106 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.374253808 podStartE2EDuration="15.517087082s" podCreationTimestamp="2026-02-19 09:00:18 +0000 UTC" firstStartedPulling="2026-02-19 09:00:21.984468661 +0000 UTC m=+923.972480133" lastFinishedPulling="2026-02-19 09:00:33.127301935 +0000 UTC m=+935.115313407" observedRunningTime="2026-02-19 09:00:33.510761222 +0000 UTC m=+935.498772704" watchObservedRunningTime="2026-02-19 09:00:33.517087082 +0000 UTC m=+935.505098554" Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.458755 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.473617 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-snwhx" event={"ID":"f7e5355c-1a77-48de-998c-7d6e676d5eee","Type":"ContainerStarted","Data":"fa5580ac15c026ac977f87ae1fa333a89fa33406e99710615bc5e187548b5dcb"} Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.473806 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.473851 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.477203 4788 generic.go:334] "Generic (PLEG): container finished" podID="776314c0-8a5e-4224-8337-d2ae060a7ecd" containerID="d2545f60ae3051223beb687bb4ba157034390630fbce6a270d926ca41bffece3" exitCode=0 Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.477274 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"776314c0-8a5e-4224-8337-d2ae060a7ecd","Type":"ContainerDied","Data":"d2545f60ae3051223beb687bb4ba157034390630fbce6a270d926ca41bffece3"} Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.483968 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9sr" event={"ID":"e2f5fb48-2371-47b0-be72-9521d37955bc","Type":"ContainerStarted","Data":"ef989681e48ec05d6020c8ee3961f4d477857a509033ba8dc90733e6c0d7d9ae"} Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.513647 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.536750 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-snwhx" podStartSLOduration=12.154865446 podStartE2EDuration="19.536727524s" podCreationTimestamp="2026-02-19 09:00:15 +0000 UTC" firstStartedPulling="2026-02-19 09:00:22.13848867 +0000 UTC m=+924.126500142" lastFinishedPulling="2026-02-19 09:00:29.520350727 +0000 UTC m=+931.508362220" observedRunningTime="2026-02-19 09:00:34.510603763 +0000 UTC m=+936.498615255" watchObservedRunningTime="2026-02-19 09:00:34.536727524 +0000 UTC m=+936.524739006" Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.544700 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9l9sr" podStartSLOduration=7.130727577 podStartE2EDuration="9.544679165s" podCreationTimestamp="2026-02-19 09:00:25 +0000 UTC" firstStartedPulling="2026-02-19 09:00:31.403756794 +0000 UTC m=+933.391768276" lastFinishedPulling="2026-02-19 09:00:33.817708382 +0000 UTC m=+935.805719864" observedRunningTime="2026-02-19 09:00:34.532935248 +0000 UTC m=+936.520946720" watchObservedRunningTime="2026-02-19 09:00:34.544679165 +0000 UTC m=+936.532690647" Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.758211 4788 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.758272 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:34 crc kubenswrapper[4788]: I0219 09:00:34.798022 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:35 crc kubenswrapper[4788]: I0219 09:00:35.100473 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 19 09:00:35 crc kubenswrapper[4788]: I0219 09:00:35.490521 4788 generic.go:334] "Generic (PLEG): container finished" podID="e1e578ea-f70e-4aed-910b-4c1c0ddb3c39" containerID="5cbdf2635d6b21077f5f1671f0d756c8c5e89eca484d70668f1ae081b928c103" exitCode=0 Feb 19 09:00:35 crc kubenswrapper[4788]: I0219 09:00:35.490659 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39","Type":"ContainerDied","Data":"5cbdf2635d6b21077f5f1671f0d756c8c5e89eca484d70668f1ae081b928c103"} Feb 19 09:00:35 crc kubenswrapper[4788]: I0219 09:00:35.493380 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"776314c0-8a5e-4224-8337-d2ae060a7ecd","Type":"ContainerStarted","Data":"ea2f3f86080b201c5fa0a8fdb72cb469cc5234bec1c0288b423300269fbe10f1"} Feb 19 09:00:35 crc kubenswrapper[4788]: I0219 09:00:35.494204 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:35 crc kubenswrapper[4788]: I0219 09:00:35.543169 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.922918469 podStartE2EDuration="27.543151081s" podCreationTimestamp="2026-02-19 09:00:08 +0000 UTC" firstStartedPulling="2026-02-19 09:00:21.835305935 +0000 UTC m=+923.823317407" 
lastFinishedPulling="2026-02-19 09:00:29.455538537 +0000 UTC m=+931.443550019" observedRunningTime="2026-02-19 09:00:35.541671694 +0000 UTC m=+937.529683176" watchObservedRunningTime="2026-02-19 09:00:35.543151081 +0000 UTC m=+937.531162553" Feb 19 09:00:35 crc kubenswrapper[4788]: I0219 09:00:35.780959 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:35 crc kubenswrapper[4788]: I0219 09:00:35.781022 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:35 crc kubenswrapper[4788]: I0219 09:00:35.826616 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.330678 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nc4p7"] Feb 19 09:00:36 crc kubenswrapper[4788]: E0219 09:00:36.331102 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" containerName="init" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.331128 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" containerName="init" Feb 19 09:00:36 crc kubenswrapper[4788]: E0219 09:00:36.331164 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" containerName="dnsmasq-dns" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.331176 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" containerName="dnsmasq-dns" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.331392 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8c9d7a-e39b-4a58-a8bd-4cba6b94342f" containerName="dnsmasq-dns" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 
09:00:36.332736 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.351144 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc4p7"] Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.485790 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhzf2\" (UniqueName: \"kubernetes.io/projected/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-kube-api-access-fhzf2\") pod \"redhat-operators-nc4p7\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.485865 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-catalog-content\") pod \"redhat-operators-nc4p7\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.485909 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-utilities\") pod \"redhat-operators-nc4p7\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.500347 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.504370 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"e1e578ea-f70e-4aed-910b-4c1c0ddb3c39","Type":"ContainerStarted","Data":"0b5a6cc5e16948adc2be80e877a81cc7f37f8269d2df025a35d63ad5794f65a0"} Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.551046 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.575800 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.878211993 podStartE2EDuration="30.575780382s" podCreationTimestamp="2026-02-19 09:00:06 +0000 UTC" firstStartedPulling="2026-02-19 09:00:19.918863601 +0000 UTC m=+921.906875113" lastFinishedPulling="2026-02-19 09:00:29.61643201 +0000 UTC m=+931.604443502" observedRunningTime="2026-02-19 09:00:36.547693161 +0000 UTC m=+938.535704643" watchObservedRunningTime="2026-02-19 09:00:36.575780382 +0000 UTC m=+938.563791854" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.587978 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhzf2\" (UniqueName: \"kubernetes.io/projected/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-kube-api-access-fhzf2\") pod \"redhat-operators-nc4p7\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.588037 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-catalog-content\") pod \"redhat-operators-nc4p7\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.588081 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-utilities\") pod 
\"redhat-operators-nc4p7\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.588589 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-utilities\") pod \"redhat-operators-nc4p7\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.596432 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-catalog-content\") pod \"redhat-operators-nc4p7\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.613520 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhzf2\" (UniqueName: \"kubernetes.io/projected/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-kube-api-access-fhzf2\") pod \"redhat-operators-nc4p7\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.672414 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.676063 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-cxxn7"] Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.683689 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.687703 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.738011 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-cxxn7"] Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.755315 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-vdj96"] Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.756390 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.758632 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.815567 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vdj96"] Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.821733 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njjsn\" (UniqueName: \"kubernetes.io/projected/3e764201-18a2-418f-88a7-94009527cfb7-kube-api-access-njjsn\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.821793 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.821827 4788 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgtnd\" (UniqueName: \"kubernetes.io/projected/c8a9b729-65f6-40d1-94d6-149133edae05-kube-api-access-pgtnd\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.821865 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a9b729-65f6-40d1-94d6-149133edae05-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.821884 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.821900 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c8a9b729-65f6-40d1-94d6-149133edae05-ovs-rundir\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.821935 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-config\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc 
kubenswrapper[4788]: I0219 09:00:36.821956 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c8a9b729-65f6-40d1-94d6-149133edae05-ovn-rundir\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.821992 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a9b729-65f6-40d1-94d6-149133edae05-config\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.822005 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a9b729-65f6-40d1-94d6-149133edae05-combined-ca-bundle\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.856859 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.858314 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.861741 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.861947 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-cgnpw" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.861742 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.862290 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.879621 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.893179 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-cxxn7"] Feb 19 09:00:36 crc kubenswrapper[4788]: E0219 09:00:36.893860 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-njjsn ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" podUID="3e764201-18a2-418f-88a7-94009527cfb7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.903836 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-whwsd"] Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.913584 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.917555 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924056 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a9b729-65f6-40d1-94d6-149133edae05-config\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924100 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a9b729-65f6-40d1-94d6-149133edae05-combined-ca-bundle\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924158 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njjsn\" (UniqueName: \"kubernetes.io/projected/3e764201-18a2-418f-88a7-94009527cfb7-kube-api-access-njjsn\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924185 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924213 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c371e63d-7b67-4bcf-a08d-292f3743c388-scripts\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924260 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgtnd\" (UniqueName: \"kubernetes.io/projected/c8a9b729-65f6-40d1-94d6-149133edae05-kube-api-access-pgtnd\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924280 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c371e63d-7b67-4bcf-a08d-292f3743c388-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924300 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c371e63d-7b67-4bcf-a08d-292f3743c388-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924329 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9tqj\" (UniqueName: \"kubernetes.io/projected/c371e63d-7b67-4bcf-a08d-292f3743c388-kube-api-access-m9tqj\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924352 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a9b729-65f6-40d1-94d6-149133edae05-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924370 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924389 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c8a9b729-65f6-40d1-94d6-149133edae05-ovs-rundir\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924427 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-config\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924634 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c371e63d-7b67-4bcf-a08d-292f3743c388-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924656 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c8a9b729-65f6-40d1-94d6-149133edae05-ovn-rundir\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " 
pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924672 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c371e63d-7b67-4bcf-a08d-292f3743c388-config\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.924696 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c371e63d-7b67-4bcf-a08d-292f3743c388-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.925341 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a9b729-65f6-40d1-94d6-149133edae05-config\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.925542 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c8a9b729-65f6-40d1-94d6-149133edae05-ovn-rundir\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.926144 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-config\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.926231 4788 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c8a9b729-65f6-40d1-94d6-149133edae05-ovs-rundir\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.927225 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.927655 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.945029 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8a9b729-65f6-40d1-94d6-149133edae05-combined-ca-bundle\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.945765 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8a9b729-65f6-40d1-94d6-149133edae05-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.949565 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njjsn\" (UniqueName: 
\"kubernetes.io/projected/3e764201-18a2-418f-88a7-94009527cfb7-kube-api-access-njjsn\") pod \"dnsmasq-dns-7fd796d7df-cxxn7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.956130 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgtnd\" (UniqueName: \"kubernetes.io/projected/c8a9b729-65f6-40d1-94d6-149133edae05-kube-api-access-pgtnd\") pod \"ovn-controller-metrics-vdj96\" (UID: \"c8a9b729-65f6-40d1-94d6-149133edae05\") " pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:36 crc kubenswrapper[4788]: I0219 09:00:36.959746 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-whwsd"] Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.027744 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.027793 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c371e63d-7b67-4bcf-a08d-292f3743c388-scripts\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.027819 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5kj9\" (UniqueName: \"kubernetes.io/projected/df0da7e1-775f-413a-a2fc-3ee4232048e0-kube-api-access-q5kj9\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 
09:00:37.027842 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c371e63d-7b67-4bcf-a08d-292f3743c388-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.027861 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c371e63d-7b67-4bcf-a08d-292f3743c388-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.027882 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.027903 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9tqj\" (UniqueName: \"kubernetes.io/projected/c371e63d-7b67-4bcf-a08d-292f3743c388-kube-api-access-m9tqj\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.027950 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c371e63d-7b67-4bcf-a08d-292f3743c388-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.027970 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c371e63d-7b67-4bcf-a08d-292f3743c388-config\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.027987 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-config\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.028007 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c371e63d-7b67-4bcf-a08d-292f3743c388-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.028046 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.028458 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c371e63d-7b67-4bcf-a08d-292f3743c388-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.028615 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c371e63d-7b67-4bcf-a08d-292f3743c388-scripts\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " 
pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.029100 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c371e63d-7b67-4bcf-a08d-292f3743c388-config\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.033829 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c371e63d-7b67-4bcf-a08d-292f3743c388-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.034691 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c371e63d-7b67-4bcf-a08d-292f3743c388-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.042859 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c371e63d-7b67-4bcf-a08d-292f3743c388-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.045967 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9tqj\" (UniqueName: \"kubernetes.io/projected/c371e63d-7b67-4bcf-a08d-292f3743c388-kube-api-access-m9tqj\") pod \"ovn-northd-0\" (UID: \"c371e63d-7b67-4bcf-a08d-292f3743c388\") " pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.129421 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.129472 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.129499 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5kj9\" (UniqueName: \"kubernetes.io/projected/df0da7e1-775f-413a-a2fc-3ee4232048e0-kube-api-access-q5kj9\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.129522 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.129580 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-config\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.131164 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.131234 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.131281 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-config\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.131612 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.142688 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-vdj96" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.158388 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5kj9\" (UniqueName: \"kubernetes.io/projected/df0da7e1-775f-413a-a2fc-3ee4232048e0-kube-api-access-q5kj9\") pod \"dnsmasq-dns-86db49b7ff-whwsd\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") " pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.193613 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.319685 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc4p7"] Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.327947 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.535191 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.536076 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4p7" event={"ID":"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3","Type":"ContainerStarted","Data":"a4b9b6ff31af3047adf90ec5aad004b52a3973cb641bd1492ddaa97ddedaadf2"} Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.581887 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:37 crc kubenswrapper[4788]: E0219 09:00:37.599120 4788 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.169:43094->38.102.83.169:44647: write tcp 38.102.83.169:43094->38.102.83.169:44647: write: broken pipe Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.635767 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-ovsdbserver-nb\") pod \"3e764201-18a2-418f-88a7-94009527cfb7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.635891 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-config\") pod \"3e764201-18a2-418f-88a7-94009527cfb7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.635965 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njjsn\" (UniqueName: \"kubernetes.io/projected/3e764201-18a2-418f-88a7-94009527cfb7-kube-api-access-njjsn\") pod \"3e764201-18a2-418f-88a7-94009527cfb7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.636037 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-dns-svc\") pod \"3e764201-18a2-418f-88a7-94009527cfb7\" (UID: \"3e764201-18a2-418f-88a7-94009527cfb7\") " Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.636600 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-config" (OuterVolumeSpecName: "config") pod 
"3e764201-18a2-418f-88a7-94009527cfb7" (UID: "3e764201-18a2-418f-88a7-94009527cfb7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.636818 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3e764201-18a2-418f-88a7-94009527cfb7" (UID: "3e764201-18a2-418f-88a7-94009527cfb7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.638838 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e764201-18a2-418f-88a7-94009527cfb7" (UID: "3e764201-18a2-418f-88a7-94009527cfb7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.653352 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e764201-18a2-418f-88a7-94009527cfb7-kube-api-access-njjsn" (OuterVolumeSpecName: "kube-api-access-njjsn") pod "3e764201-18a2-418f-88a7-94009527cfb7" (UID: "3e764201-18a2-418f-88a7-94009527cfb7"). InnerVolumeSpecName "kube-api-access-njjsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.655709 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vdj96"] Feb 19 09:00:37 crc kubenswrapper[4788]: W0219 09:00:37.670896 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8a9b729_65f6_40d1_94d6_149133edae05.slice/crio-95fcdd36e06ea0a0a63d9ad1453222447068efd9a571bd42e418e64684bc58d9 WatchSource:0}: Error finding container 95fcdd36e06ea0a0a63d9ad1453222447068efd9a571bd42e418e64684bc58d9: Status 404 returned error can't find the container with id 95fcdd36e06ea0a0a63d9ad1453222447068efd9a571bd42e418e64684bc58d9 Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.738579 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.738959 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.738975 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e764201-18a2-418f-88a7-94009527cfb7-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.739013 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njjsn\" (UniqueName: \"kubernetes.io/projected/3e764201-18a2-418f-88a7-94009527cfb7-kube-api-access-njjsn\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.862180 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 19 09:00:37 crc 
kubenswrapper[4788]: W0219 09:00:37.894447 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc371e63d_7b67_4bcf_a08d_292f3743c388.slice/crio-2b8f33e9f51e1fa9c38beccbfb2bf42e93f560fa53b850377d22f7f60aedd674 WatchSource:0}: Error finding container 2b8f33e9f51e1fa9c38beccbfb2bf42e93f560fa53b850377d22f7f60aedd674: Status 404 returned error can't find the container with id 2b8f33e9f51e1fa9c38beccbfb2bf42e93f560fa53b850377d22f7f60aedd674 Feb 19 09:00:37 crc kubenswrapper[4788]: I0219 09:00:37.988040 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-whwsd"] Feb 19 09:00:37 crc kubenswrapper[4788]: W0219 09:00:37.992941 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf0da7e1_775f_413a_a2fc_3ee4232048e0.slice/crio-ee74b244abc0908eef0b224f55fb6215cbb7fd9eba29453b4a34b26859b7848e WatchSource:0}: Error finding container ee74b244abc0908eef0b224f55fb6215cbb7fd9eba29453b4a34b26859b7848e: Status 404 returned error can't find the container with id ee74b244abc0908eef0b224f55fb6215cbb7fd9eba29453b4a34b26859b7848e Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.316969 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.317036 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.545178 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vdj96" event={"ID":"c8a9b729-65f6-40d1-94d6-149133edae05","Type":"ContainerStarted","Data":"bca1566748344378fafcf4d452df250f36f1f14380346aa0f02981adb802a28a"} Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.545234 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-vdj96" event={"ID":"c8a9b729-65f6-40d1-94d6-149133edae05","Type":"ContainerStarted","Data":"95fcdd36e06ea0a0a63d9ad1453222447068efd9a571bd42e418e64684bc58d9"} Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.546631 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c371e63d-7b67-4bcf-a08d-292f3743c388","Type":"ContainerStarted","Data":"2b8f33e9f51e1fa9c38beccbfb2bf42e93f560fa53b850377d22f7f60aedd674"} Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.548280 4788 generic.go:334] "Generic (PLEG): container finished" podID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" containerID="2646a3a265de95b1955be23f3f08e09427fc77b6b2d889bc0f1d760307603147" exitCode=0 Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.548394 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4p7" event={"ID":"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3","Type":"ContainerDied","Data":"2646a3a265de95b1955be23f3f08e09427fc77b6b2d889bc0f1d760307603147"} Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.550060 4788 generic.go:334] "Generic (PLEG): container finished" podID="df0da7e1-775f-413a-a2fc-3ee4232048e0" containerID="bc9dad1c8b96583b095f43327a26f5b52928020b47328c06387a4db217612f2d" exitCode=0 Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.550125 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-cxxn7" Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.550224 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" event={"ID":"df0da7e1-775f-413a-a2fc-3ee4232048e0","Type":"ContainerDied","Data":"bc9dad1c8b96583b095f43327a26f5b52928020b47328c06387a4db217612f2d"} Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.550259 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" event={"ID":"df0da7e1-775f-413a-a2fc-3ee4232048e0","Type":"ContainerStarted","Data":"ee74b244abc0908eef0b224f55fb6215cbb7fd9eba29453b4a34b26859b7848e"} Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.570845 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-vdj96" podStartSLOduration=2.5708203149999997 podStartE2EDuration="2.570820315s" podCreationTimestamp="2026-02-19 09:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:00:38.564964977 +0000 UTC m=+940.552976449" watchObservedRunningTime="2026-02-19 09:00:38.570820315 +0000 UTC m=+940.558831797" Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.785090 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-cxxn7"] Feb 19 09:00:38 crc kubenswrapper[4788]: I0219 09:00:38.791436 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-cxxn7"] Feb 19 09:00:39 crc kubenswrapper[4788]: I0219 09:00:39.560565 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4p7" event={"ID":"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3","Type":"ContainerStarted","Data":"731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb"} Feb 19 09:00:39 crc kubenswrapper[4788]: I0219 09:00:39.562406 4788 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" event={"ID":"df0da7e1-775f-413a-a2fc-3ee4232048e0","Type":"ContainerStarted","Data":"f79f7f215457e3f62436a7880fc0f4e2d278d59ba21cabcab9023b3bc6dcf828"} Feb 19 09:00:39 crc kubenswrapper[4788]: I0219 09:00:39.562829 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" Feb 19 09:00:39 crc kubenswrapper[4788]: I0219 09:00:39.563880 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c371e63d-7b67-4bcf-a08d-292f3743c388","Type":"ContainerStarted","Data":"c0361fdcddea3bd205ec129ba17062922c0e9daa9f7c15e2ecafe9dd6a78c35c"} Feb 19 09:00:39 crc kubenswrapper[4788]: I0219 09:00:39.605760 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" podStartSLOduration=3.605733583 podStartE2EDuration="3.605733583s" podCreationTimestamp="2026-02-19 09:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:00:39.598194402 +0000 UTC m=+941.586205914" watchObservedRunningTime="2026-02-19 09:00:39.605733583 +0000 UTC m=+941.593745085" Feb 19 09:00:39 crc kubenswrapper[4788]: I0219 09:00:39.752855 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:39 crc kubenswrapper[4788]: I0219 09:00:39.753312 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:39 crc kubenswrapper[4788]: I0219 09:00:39.848527 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.276888 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.276980 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.333653 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.422934 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.526381 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.572670 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c371e63d-7b67-4bcf-a08d-292f3743c388","Type":"ContainerStarted","Data":"c76132e5325675c557baa6238133175cb7cd483765bb1466057049a06e9965ea"} Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.573088 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.575555 4788 generic.go:334] "Generic (PLEG): container finished" podID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" containerID="731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb" exitCode=0 Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.577650 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4p7" event={"ID":"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3","Type":"ContainerDied","Data":"731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb"} Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.601675 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" 
podStartSLOduration=3.270270101 podStartE2EDuration="4.601654345s" podCreationTimestamp="2026-02-19 09:00:36 +0000 UTC" firstStartedPulling="2026-02-19 09:00:37.896168646 +0000 UTC m=+939.884180118" lastFinishedPulling="2026-02-19 09:00:39.22755289 +0000 UTC m=+941.215564362" observedRunningTime="2026-02-19 09:00:40.597935961 +0000 UTC m=+942.585947443" watchObservedRunningTime="2026-02-19 09:00:40.601654345 +0000 UTC m=+942.589665817" Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.653336 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.666814 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 19 09:00:40 crc kubenswrapper[4788]: I0219 09:00:40.733453 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e764201-18a2-418f-88a7-94009527cfb7" path="/var/lib/kubelet/pods/3e764201-18a2-418f-88a7-94009527cfb7/volumes" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.206485 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ab60-account-create-update-755dc"] Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.207636 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ab60-account-create-update-755dc" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.212008 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-8tsc9"] Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.213018 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8tsc9" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.213506 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.219310 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ab60-account-create-update-755dc"] Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.228070 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8tsc9"] Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.294941 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-2v24m"] Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.295933 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2v24m" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.303292 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2v24m"] Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.328378 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjcq7\" (UniqueName: \"kubernetes.io/projected/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-kube-api-access-tjcq7\") pod \"keystone-db-create-8tsc9\" (UID: \"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b\") " pod="openstack/keystone-db-create-8tsc9" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.328484 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgjtx\" (UniqueName: \"kubernetes.io/projected/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-kube-api-access-bgjtx\") pod \"keystone-ab60-account-create-update-755dc\" (UID: \"72c57601-e6f1-4092-9d4a-49e8c5cf38e3\") " pod="openstack/keystone-ab60-account-create-update-755dc" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.328510 4788 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-operator-scripts\") pod \"keystone-ab60-account-create-update-755dc\" (UID: \"72c57601-e6f1-4092-9d4a-49e8c5cf38e3\") " pod="openstack/keystone-ab60-account-create-update-755dc" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.328632 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-operator-scripts\") pod \"keystone-db-create-8tsc9\" (UID: \"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b\") " pod="openstack/keystone-db-create-8tsc9" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.394884 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-4f7d-account-create-update-jpgd6"] Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.396600 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-4f7d-account-create-update-jpgd6" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.399393 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.404222 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4f7d-account-create-update-jpgd6"] Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.430084 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-operator-scripts\") pod \"keystone-db-create-8tsc9\" (UID: \"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b\") " pod="openstack/keystone-db-create-8tsc9" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.430434 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjcq7\" (UniqueName: \"kubernetes.io/projected/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-kube-api-access-tjcq7\") pod \"keystone-db-create-8tsc9\" (UID: \"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b\") " pod="openstack/keystone-db-create-8tsc9" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.430617 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgjtx\" (UniqueName: \"kubernetes.io/projected/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-kube-api-access-bgjtx\") pod \"keystone-ab60-account-create-update-755dc\" (UID: \"72c57601-e6f1-4092-9d4a-49e8c5cf38e3\") " pod="openstack/keystone-ab60-account-create-update-755dc" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.430704 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-operator-scripts\") pod \"keystone-ab60-account-create-update-755dc\" (UID: \"72c57601-e6f1-4092-9d4a-49e8c5cf38e3\") " 
pod="openstack/keystone-ab60-account-create-update-755dc" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.430801 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkzwg\" (UniqueName: \"kubernetes.io/projected/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-kube-api-access-jkzwg\") pod \"placement-db-create-2v24m\" (UID: \"893edbf1-7994-499b-bd8e-45d7f3e9eb5f\") " pod="openstack/placement-db-create-2v24m" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.430907 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-operator-scripts\") pod \"placement-db-create-2v24m\" (UID: \"893edbf1-7994-499b-bd8e-45d7f3e9eb5f\") " pod="openstack/placement-db-create-2v24m" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.431445 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-operator-scripts\") pod \"keystone-db-create-8tsc9\" (UID: \"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b\") " pod="openstack/keystone-db-create-8tsc9" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.431678 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-operator-scripts\") pod \"keystone-ab60-account-create-update-755dc\" (UID: \"72c57601-e6f1-4092-9d4a-49e8c5cf38e3\") " pod="openstack/keystone-ab60-account-create-update-755dc" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.448667 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgjtx\" (UniqueName: \"kubernetes.io/projected/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-kube-api-access-bgjtx\") pod \"keystone-ab60-account-create-update-755dc\" (UID: 
\"72c57601-e6f1-4092-9d4a-49e8c5cf38e3\") " pod="openstack/keystone-ab60-account-create-update-755dc" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.449979 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjcq7\" (UniqueName: \"kubernetes.io/projected/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-kube-api-access-tjcq7\") pod \"keystone-db-create-8tsc9\" (UID: \"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b\") " pod="openstack/keystone-db-create-8tsc9" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.532552 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-operator-scripts\") pod \"placement-4f7d-account-create-update-jpgd6\" (UID: \"d7aae0ad-43eb-43dd-b926-6b0847fa9eea\") " pod="openstack/placement-4f7d-account-create-update-jpgd6" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.532665 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkzwg\" (UniqueName: \"kubernetes.io/projected/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-kube-api-access-jkzwg\") pod \"placement-db-create-2v24m\" (UID: \"893edbf1-7994-499b-bd8e-45d7f3e9eb5f\") " pod="openstack/placement-db-create-2v24m" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.532696 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plzth\" (UniqueName: \"kubernetes.io/projected/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-kube-api-access-plzth\") pod \"placement-4f7d-account-create-update-jpgd6\" (UID: \"d7aae0ad-43eb-43dd-b926-6b0847fa9eea\") " pod="openstack/placement-4f7d-account-create-update-jpgd6" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.532723 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-operator-scripts\") pod \"placement-db-create-2v24m\" (UID: \"893edbf1-7994-499b-bd8e-45d7f3e9eb5f\") " pod="openstack/placement-db-create-2v24m" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.533585 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-operator-scripts\") pod \"placement-db-create-2v24m\" (UID: \"893edbf1-7994-499b-bd8e-45d7f3e9eb5f\") " pod="openstack/placement-db-create-2v24m" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.540675 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ab60-account-create-update-755dc" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.541527 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8tsc9" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.548942 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkzwg\" (UniqueName: \"kubernetes.io/projected/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-kube-api-access-jkzwg\") pod \"placement-db-create-2v24m\" (UID: \"893edbf1-7994-499b-bd8e-45d7f3e9eb5f\") " pod="openstack/placement-db-create-2v24m" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.594830 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4p7" event={"ID":"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3","Type":"ContainerStarted","Data":"ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645"} Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.612607 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nc4p7" podStartSLOduration=3.047297866 podStartE2EDuration="5.612590965s" podCreationTimestamp="2026-02-19 09:00:36 +0000 UTC" 
firstStartedPulling="2026-02-19 09:00:38.549781092 +0000 UTC m=+940.537792564" lastFinishedPulling="2026-02-19 09:00:41.115074191 +0000 UTC m=+943.103085663" observedRunningTime="2026-02-19 09:00:41.610437621 +0000 UTC m=+943.598449103" watchObservedRunningTime="2026-02-19 09:00:41.612590965 +0000 UTC m=+943.600602437" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.618038 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2v24m" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.633865 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plzth\" (UniqueName: \"kubernetes.io/projected/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-kube-api-access-plzth\") pod \"placement-4f7d-account-create-update-jpgd6\" (UID: \"d7aae0ad-43eb-43dd-b926-6b0847fa9eea\") " pod="openstack/placement-4f7d-account-create-update-jpgd6" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.634773 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-operator-scripts\") pod \"placement-4f7d-account-create-update-jpgd6\" (UID: \"d7aae0ad-43eb-43dd-b926-6b0847fa9eea\") " pod="openstack/placement-4f7d-account-create-update-jpgd6" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.635614 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-operator-scripts\") pod \"placement-4f7d-account-create-update-jpgd6\" (UID: \"d7aae0ad-43eb-43dd-b926-6b0847fa9eea\") " pod="openstack/placement-4f7d-account-create-update-jpgd6" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.655185 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plzth\" (UniqueName: 
\"kubernetes.io/projected/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-kube-api-access-plzth\") pod \"placement-4f7d-account-create-update-jpgd6\" (UID: \"d7aae0ad-43eb-43dd-b926-6b0847fa9eea\") " pod="openstack/placement-4f7d-account-create-update-jpgd6" Feb 19 09:00:41 crc kubenswrapper[4788]: I0219 09:00:41.710251 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4f7d-account-create-update-jpgd6" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.113819 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8tsc9"] Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.127266 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ab60-account-create-update-755dc"] Feb 19 09:00:42 crc kubenswrapper[4788]: W0219 09:00:42.131389 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72c57601_e6f1_4092_9d4a_49e8c5cf38e3.slice/crio-7e42e8194544d421fb4c9367422d360254af4713b35abe4094fb6901764133e8 WatchSource:0}: Error finding container 7e42e8194544d421fb4c9367422d360254af4713b35abe4094fb6901764133e8: Status 404 returned error can't find the container with id 7e42e8194544d421fb4c9367422d360254af4713b35abe4094fb6901764133e8 Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.235746 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-whwsd"] Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.240689 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.304529 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-4f7d-account-create-update-jpgd6"] Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.310309 4788 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-698758b865-bwfcl"] Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.311590 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.328203 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bwfcl"] Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.352934 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-config\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.352991 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8nwb\" (UniqueName: \"kubernetes.io/projected/9fa7a666-34c5-42b5-9c2b-25d39c505be2-kube-api-access-c8nwb\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.353041 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.353097 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " 
pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.353157 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-dns-svc\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.406340 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-2v24m"] Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.451119 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxzll"] Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.458254 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.459334 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-dns-svc\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.459950 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-dns-svc\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.460193 4788 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.460368 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-config\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.460421 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8nwb\" (UniqueName: \"kubernetes.io/projected/9fa7a666-34c5-42b5-9c2b-25d39c505be2-kube-api-access-c8nwb\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.464498 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.464937 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-config\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.465553 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.488454 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8nwb\" (UniqueName: \"kubernetes.io/projected/9fa7a666-34c5-42b5-9c2b-25d39c505be2-kube-api-access-c8nwb\") pod \"dnsmasq-dns-698758b865-bwfcl\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.489714 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.604992 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2v24m" event={"ID":"893edbf1-7994-499b-bd8e-45d7f3e9eb5f","Type":"ContainerStarted","Data":"ed551e75aecbf0bf6af6b254de8571265ffd86c9c070065a82ba3475f422de37"} Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.608395 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8tsc9" event={"ID":"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b","Type":"ContainerStarted","Data":"8048e78cd541778c58128d7cd30d64a40746f152fbf66e6a830537fc96bfadbf"} Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.608425 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8tsc9" event={"ID":"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b","Type":"ContainerStarted","Data":"27bd8fbee73dcb03d901324c52d34412f2f5c0fd8e86235300d10b4bb12d642c"} Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.633046 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4f7d-account-create-update-jpgd6" 
event={"ID":"d7aae0ad-43eb-43dd-b926-6b0847fa9eea","Type":"ContainerStarted","Data":"4f945a30cc7553f86b2abf7137b20a27b700b7fba8438fde330eaad1d1cac6bc"} Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.638072 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" podUID="df0da7e1-775f-413a-a2fc-3ee4232048e0" containerName="dnsmasq-dns" containerID="cri-o://f79f7f215457e3f62436a7880fc0f4e2d278d59ba21cabcab9023b3bc6dcf828" gracePeriod=10 Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.637966 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ab60-account-create-update-755dc" event={"ID":"72c57601-e6f1-4092-9d4a-49e8c5cf38e3","Type":"ContainerStarted","Data":"e8c9a156a40c423ac06997fa3fefd0442bba40c9fb97dfe85d3ebffdb0a72307"} Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.638669 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ab60-account-create-update-755dc" event={"ID":"72c57601-e6f1-4092-9d4a-49e8c5cf38e3","Type":"ContainerStarted","Data":"7e42e8194544d421fb4c9367422d360254af4713b35abe4094fb6901764133e8"} Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.641303 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-8tsc9" podStartSLOduration=1.6412848260000001 podStartE2EDuration="1.641284826s" podCreationTimestamp="2026-02-19 09:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:00:42.632887434 +0000 UTC m=+944.620898906" watchObservedRunningTime="2026-02-19 09:00:42.641284826 +0000 UTC m=+944.629296298" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.666015 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-ab60-account-create-update-755dc" podStartSLOduration=1.665994402 podStartE2EDuration="1.665994402s" 
podCreationTimestamp="2026-02-19 09:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:00:42.664111154 +0000 UTC m=+944.652122626" watchObservedRunningTime="2026-02-19 09:00:42.665994402 +0000 UTC m=+944.654005874" Feb 19 09:00:42 crc kubenswrapper[4788]: I0219 09:00:42.979506 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bwfcl"] Feb 19 09:00:42 crc kubenswrapper[4788]: W0219 09:00:42.992369 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fa7a666_34c5_42b5_9c2b_25d39c505be2.slice/crio-712f25f3544699b4ab45bf3cebc04e4bfcda7dc2eeacb8dbde130902f1d6f671 WatchSource:0}: Error finding container 712f25f3544699b4ab45bf3cebc04e4bfcda7dc2eeacb8dbde130902f1d6f671: Status 404 returned error can't find the container with id 712f25f3544699b4ab45bf3cebc04e4bfcda7dc2eeacb8dbde130902f1d6f671 Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.451812 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.465366 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.465527 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.471617 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.471917 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-k7qvx" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.472032 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.472142 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.481806 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.481858 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f91d0357-4651-41b6-a842-27d8c7f47e60-cache\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.481889 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91d0357-4651-41b6-a842-27d8c7f47e60-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.481929 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-ksnbf\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-kube-api-access-ksnbf\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.481987 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f91d0357-4651-41b6-a842-27d8c7f47e60-lock\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.482076 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.583132 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.583535 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.583556 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f91d0357-4651-41b6-a842-27d8c7f47e60-cache\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" 
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.583554 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.583577 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91d0357-4651-41b6-a842-27d8c7f47e60-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.583602 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksnbf\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-kube-api-access-ksnbf\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.583642 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f91d0357-4651-41b6-a842-27d8c7f47e60-lock\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: E0219 09:00:43.583663 4788 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 09:00:43 crc kubenswrapper[4788]: E0219 09:00:43.583680 4788 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 09:00:43 crc kubenswrapper[4788]: E0219 09:00:43.583717 4788 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift podName:f91d0357-4651-41b6-a842-27d8c7f47e60 nodeName:}" failed. No retries permitted until 2026-02-19 09:00:44.083703363 +0000 UTC m=+946.071714835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift") pod "swift-storage-0" (UID: "f91d0357-4651-41b6-a842-27d8c7f47e60") : configmap "swift-ring-files" not found Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.584352 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/f91d0357-4651-41b6-a842-27d8c7f47e60-lock\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.584594 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/f91d0357-4651-41b6-a842-27d8c7f47e60-cache\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.592312 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f91d0357-4651-41b6-a842-27d8c7f47e60-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.606334 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksnbf\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-kube-api-access-ksnbf\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.608893 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.658626 4788 generic.go:334] "Generic (PLEG): container finished" podID="9fa7a666-34c5-42b5-9c2b-25d39c505be2" containerID="6bd6ea079e989126512aaf5ce85ea03d4bb3cc021826f03dd8d39ec46a0829bb" exitCode=0
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.658711 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bwfcl" event={"ID":"9fa7a666-34c5-42b5-9c2b-25d39c505be2","Type":"ContainerDied","Data":"6bd6ea079e989126512aaf5ce85ea03d4bb3cc021826f03dd8d39ec46a0829bb"}
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.658753 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bwfcl" event={"ID":"9fa7a666-34c5-42b5-9c2b-25d39c505be2","Type":"ContainerStarted","Data":"712f25f3544699b4ab45bf3cebc04e4bfcda7dc2eeacb8dbde130902f1d6f671"}
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.670047 4788 generic.go:334] "Generic (PLEG): container finished" podID="d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b" containerID="8048e78cd541778c58128d7cd30d64a40746f152fbf66e6a830537fc96bfadbf" exitCode=0
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.670128 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8tsc9" event={"ID":"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b","Type":"ContainerDied","Data":"8048e78cd541778c58128d7cd30d64a40746f152fbf66e6a830537fc96bfadbf"}
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.677331 4788 generic.go:334] "Generic (PLEG): container finished" podID="d7aae0ad-43eb-43dd-b926-6b0847fa9eea" containerID="521a7ea9f60cd14ae42a92c828099e7e5154afe94d17ea85ffde3e2c24c46301" exitCode=0
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.677421 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4f7d-account-create-update-jpgd6" event={"ID":"d7aae0ad-43eb-43dd-b926-6b0847fa9eea","Type":"ContainerDied","Data":"521a7ea9f60cd14ae42a92c828099e7e5154afe94d17ea85ffde3e2c24c46301"}
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.694868 4788 generic.go:334] "Generic (PLEG): container finished" podID="72c57601-e6f1-4092-9d4a-49e8c5cf38e3" containerID="e8c9a156a40c423ac06997fa3fefd0442bba40c9fb97dfe85d3ebffdb0a72307" exitCode=0
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.695119 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ab60-account-create-update-755dc" event={"ID":"72c57601-e6f1-4092-9d4a-49e8c5cf38e3","Type":"ContainerDied","Data":"e8c9a156a40c423ac06997fa3fefd0442bba40c9fb97dfe85d3ebffdb0a72307"}
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.702628 4788 generic.go:334] "Generic (PLEG): container finished" podID="df0da7e1-775f-413a-a2fc-3ee4232048e0" containerID="f79f7f215457e3f62436a7880fc0f4e2d278d59ba21cabcab9023b3bc6dcf828" exitCode=0
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.702846 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" event={"ID":"df0da7e1-775f-413a-a2fc-3ee4232048e0","Type":"ContainerDied","Data":"f79f7f215457e3f62436a7880fc0f4e2d278d59ba21cabcab9023b3bc6dcf828"}
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.702968 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd" event={"ID":"df0da7e1-775f-413a-a2fc-3ee4232048e0","Type":"ContainerDied","Data":"ee74b244abc0908eef0b224f55fb6215cbb7fd9eba29453b4a34b26859b7848e"}
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.703047 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee74b244abc0908eef0b224f55fb6215cbb7fd9eba29453b4a34b26859b7848e"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.719537 4788 generic.go:334] "Generic (PLEG): container finished" podID="893edbf1-7994-499b-bd8e-45d7f3e9eb5f" containerID="d35817ec6d70bb48295ca866918e33dd9f2c28cfe5e68556e823c24fd245486d" exitCode=0
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.719793 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vxzll" podUID="e71e3f37-f071-4790-977c-915af714faf8" containerName="registry-server" containerID="cri-o://d5b736388630ef01dd35f5ee64b984608de37ff930e6cc9a8b72106a587ce63b" gracePeriod=2
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.720142 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2v24m" event={"ID":"893edbf1-7994-499b-bd8e-45d7f3e9eb5f","Type":"ContainerDied","Data":"d35817ec6d70bb48295ca866918e33dd9f2c28cfe5e68556e823c24fd245486d"}
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.847887 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-tc7kz"]
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.850088 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.852839 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.853156 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.856558 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.857883 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.909582 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tc7kz"]
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.996940 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-dns-svc\") pod \"df0da7e1-775f-413a-a2fc-3ee4232048e0\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") "
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.997063 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5kj9\" (UniqueName: \"kubernetes.io/projected/df0da7e1-775f-413a-a2fc-3ee4232048e0-kube-api-access-q5kj9\") pod \"df0da7e1-775f-413a-a2fc-3ee4232048e0\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") "
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.997092 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-config\") pod \"df0da7e1-775f-413a-a2fc-3ee4232048e0\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") "
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.997190 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-nb\") pod \"df0da7e1-775f-413a-a2fc-3ee4232048e0\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") "
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.997225 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-sb\") pod \"df0da7e1-775f-413a-a2fc-3ee4232048e0\" (UID: \"df0da7e1-775f-413a-a2fc-3ee4232048e0\") "
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.999397 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-ring-data-devices\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.999439 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-swiftconf\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.999553 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-dispersionconf\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.999664 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm46r\" (UniqueName: \"kubernetes.io/projected/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-kube-api-access-mm46r\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.999729 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-etc-swift\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.999783 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-scripts\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:43 crc kubenswrapper[4788]: I0219 09:00:43.999857 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-combined-ca-bundle\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.035451 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0da7e1-775f-413a-a2fc-3ee4232048e0-kube-api-access-q5kj9" (OuterVolumeSpecName: "kube-api-access-q5kj9") pod "df0da7e1-775f-413a-a2fc-3ee4232048e0" (UID: "df0da7e1-775f-413a-a2fc-3ee4232048e0"). InnerVolumeSpecName "kube-api-access-q5kj9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.041935 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "df0da7e1-775f-413a-a2fc-3ee4232048e0" (UID: "df0da7e1-775f-413a-a2fc-3ee4232048e0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.041977 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df0da7e1-775f-413a-a2fc-3ee4232048e0" (UID: "df0da7e1-775f-413a-a2fc-3ee4232048e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.055063 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-config" (OuterVolumeSpecName: "config") pod "df0da7e1-775f-413a-a2fc-3ee4232048e0" (UID: "df0da7e1-775f-413a-a2fc-3ee4232048e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.071706 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df0da7e1-775f-413a-a2fc-3ee4232048e0" (UID: "df0da7e1-775f-413a-a2fc-3ee4232048e0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.101367 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-etc-swift\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.101632 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-scripts\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.101714 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-combined-ca-bundle\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.101825 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-ring-data-devices\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.101904 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-swiftconf\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.102003 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-dispersionconf\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.102099 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm46r\" (UniqueName: \"kubernetes.io/projected/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-kube-api-access-mm46r\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.102184 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.102291 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.102360 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5kj9\" (UniqueName: \"kubernetes.io/projected/df0da7e1-775f-413a-a2fc-3ee4232048e0-kube-api-access-q5kj9\") on node \"crc\" DevicePath \"\""
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.102422 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-config\") on node \"crc\" DevicePath \"\""
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.102480 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.102535 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df0da7e1-775f-413a-a2fc-3ee4232048e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.102541 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-ring-data-devices\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.101761 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-etc-swift\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: E0219 09:00:44.102614 4788 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 09:00:44 crc kubenswrapper[4788]: E0219 09:00:44.102769 4788 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.102483 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-scripts\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: E0219 09:00:44.102951 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift podName:f91d0357-4651-41b6-a842-27d8c7f47e60 nodeName:}" failed. No retries permitted until 2026-02-19 09:00:45.102935078 +0000 UTC m=+947.090946630 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift") pod "swift-storage-0" (UID: "f91d0357-4651-41b6-a842-27d8c7f47e60") : configmap "swift-ring-files" not found
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.104765 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-dispersionconf\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.104832 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-swiftconf\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.105020 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-combined-ca-bundle\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.118427 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm46r\" (UniqueName: \"kubernetes.io/projected/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-kube-api-access-mm46r\") pod \"swift-ring-rebalance-tc7kz\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:44 crc kubenswrapper[4788]: I0219 09:00:44.245212 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tc7kz"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:44.701910 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-tc7kz"]
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:44.734715 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tc7kz" event={"ID":"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a","Type":"ContainerStarted","Data":"7f4021678ef13b6ba2637d33cad4359c32fe5be7746644ae020b9e833eb0264f"}
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:44.737280 4788 generic.go:334] "Generic (PLEG): container finished" podID="e71e3f37-f071-4790-977c-915af714faf8" containerID="d5b736388630ef01dd35f5ee64b984608de37ff930e6cc9a8b72106a587ce63b" exitCode=0
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:44.737322 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxzll" event={"ID":"e71e3f37-f071-4790-977c-915af714faf8","Type":"ContainerDied","Data":"d5b736388630ef01dd35f5ee64b984608de37ff930e6cc9a8b72106a587ce63b"}
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:44.739133 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bwfcl" event={"ID":"9fa7a666-34c5-42b5-9c2b-25d39c505be2","Type":"ContainerStarted","Data":"9396ef11e5bff8e56438a2fb5019baf319c491f2c99e39a05c7f25568cfddd0a"}
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:44.739395 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-whwsd"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:44.784035 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-bwfcl" podStartSLOduration=2.784012728 podStartE2EDuration="2.784012728s" podCreationTimestamp="2026-02-19 09:00:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:00:44.758900842 +0000 UTC m=+946.746912324" watchObservedRunningTime="2026-02-19 09:00:44.784012728 +0000 UTC m=+946.772024200"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:44.819234 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-whwsd"]
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:44.835868 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-whwsd"]
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:44.863526 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxzll"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.015278 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-utilities\") pod \"e71e3f37-f071-4790-977c-915af714faf8\" (UID: \"e71e3f37-f071-4790-977c-915af714faf8\") "
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.015351 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-catalog-content\") pod \"e71e3f37-f071-4790-977c-915af714faf8\" (UID: \"e71e3f37-f071-4790-977c-915af714faf8\") "
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.015493 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg827\" (UniqueName: \"kubernetes.io/projected/e71e3f37-f071-4790-977c-915af714faf8-kube-api-access-gg827\") pod \"e71e3f37-f071-4790-977c-915af714faf8\" (UID: \"e71e3f37-f071-4790-977c-915af714faf8\") "
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.017926 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-utilities" (OuterVolumeSpecName: "utilities") pod "e71e3f37-f071-4790-977c-915af714faf8" (UID: "e71e3f37-f071-4790-977c-915af714faf8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.020914 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71e3f37-f071-4790-977c-915af714faf8-kube-api-access-gg827" (OuterVolumeSpecName: "kube-api-access-gg827") pod "e71e3f37-f071-4790-977c-915af714faf8" (UID: "e71e3f37-f071-4790-977c-915af714faf8"). InnerVolumeSpecName "kube-api-access-gg827". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.083010 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e71e3f37-f071-4790-977c-915af714faf8" (UID: "e71e3f37-f071-4790-977c-915af714faf8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.117753 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.117830 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg827\" (UniqueName: \"kubernetes.io/projected/e71e3f37-f071-4790-977c-915af714faf8-kube-api-access-gg827\") on node \"crc\" DevicePath \"\""
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.117842 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.117851 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e71e3f37-f071-4790-977c-915af714faf8-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 09:00:45 crc kubenswrapper[4788]: E0219 09:00:45.117919 4788 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 19 09:00:45 crc kubenswrapper[4788]: E0219 09:00:45.117949 4788 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 19 09:00:45 crc kubenswrapper[4788]: E0219 09:00:45.117991 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift podName:f91d0357-4651-41b6-a842-27d8c7f47e60 nodeName:}" failed. No retries permitted until 2026-02-19 09:00:47.117977282 +0000 UTC m=+949.105988754 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift") pod "swift-storage-0" (UID: "f91d0357-4651-41b6-a842-27d8c7f47e60") : configmap "swift-ring-files" not found
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.314709 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-q8z7r"]
Feb 19 09:00:45 crc kubenswrapper[4788]: E0219 09:00:45.315101 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71e3f37-f071-4790-977c-915af714faf8" containerName="extract-utilities"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.315121 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71e3f37-f071-4790-977c-915af714faf8" containerName="extract-utilities"
Feb 19 09:00:45 crc kubenswrapper[4788]: E0219 09:00:45.315139 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71e3f37-f071-4790-977c-915af714faf8" containerName="registry-server"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.315148 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71e3f37-f071-4790-977c-915af714faf8" containerName="registry-server"
Feb 19 09:00:45 crc kubenswrapper[4788]: E0219 09:00:45.315163 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71e3f37-f071-4790-977c-915af714faf8" containerName="extract-content"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.315174 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71e3f37-f071-4790-977c-915af714faf8" containerName="extract-content"
Feb 19 09:00:45 crc kubenswrapper[4788]: E0219 09:00:45.315209 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0da7e1-775f-413a-a2fc-3ee4232048e0" containerName="init"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.315216 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0da7e1-775f-413a-a2fc-3ee4232048e0" containerName="init"
Feb 19 09:00:45 crc kubenswrapper[4788]: E0219 09:00:45.315253 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0da7e1-775f-413a-a2fc-3ee4232048e0" containerName="dnsmasq-dns"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.315262 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0da7e1-775f-413a-a2fc-3ee4232048e0" containerName="dnsmasq-dns"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.315484 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71e3f37-f071-4790-977c-915af714faf8" containerName="registry-server"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.315506 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0da7e1-775f-413a-a2fc-3ee4232048e0" containerName="dnsmasq-dns"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.316161 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-q8z7r"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.326353 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-q8z7r"]
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.413365 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1d7e-account-create-update-bhj4l"]
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.419665 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1d7e-account-create-update-bhj4l"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.422465 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws4rz\" (UniqueName: \"kubernetes.io/projected/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-kube-api-access-ws4rz\") pod \"glance-db-create-q8z7r\" (UID: \"f1e0aea8-b2f5-42f4-ab90-77423a7832ce\") " pod="openstack/glance-db-create-q8z7r"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.422587 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-operator-scripts\") pod \"glance-db-create-q8z7r\" (UID: \"f1e0aea8-b2f5-42f4-ab90-77423a7832ce\") " pod="openstack/glance-db-create-q8z7r"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.426127 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.430833 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1d7e-account-create-update-bhj4l"]
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.494809 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8tsc9"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.525250 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws4rz\" (UniqueName: \"kubernetes.io/projected/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-kube-api-access-ws4rz\") pod \"glance-db-create-q8z7r\" (UID: \"f1e0aea8-b2f5-42f4-ab90-77423a7832ce\") " pod="openstack/glance-db-create-q8z7r"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.525303 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qppxj\" (UniqueName: \"kubernetes.io/projected/73420a48-bf28-40cc-b232-fab14ef5745e-kube-api-access-qppxj\") pod \"glance-1d7e-account-create-update-bhj4l\" (UID: \"73420a48-bf28-40cc-b232-fab14ef5745e\") " pod="openstack/glance-1d7e-account-create-update-bhj4l"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.525548 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-operator-scripts\") pod \"glance-db-create-q8z7r\" (UID: \"f1e0aea8-b2f5-42f4-ab90-77423a7832ce\") " pod="openstack/glance-db-create-q8z7r"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.525589 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73420a48-bf28-40cc-b232-fab14ef5745e-operator-scripts\") pod \"glance-1d7e-account-create-update-bhj4l\" (UID: \"73420a48-bf28-40cc-b232-fab14ef5745e\") " pod="openstack/glance-1d7e-account-create-update-bhj4l"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.526614 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-operator-scripts\") pod \"glance-db-create-q8z7r\" (UID: \"f1e0aea8-b2f5-42f4-ab90-77423a7832ce\") " pod="openstack/glance-db-create-q8z7r"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.547027 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws4rz\" (UniqueName: \"kubernetes.io/projected/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-kube-api-access-ws4rz\") pod \"glance-db-create-q8z7r\" (UID: \"f1e0aea8-b2f5-42f4-ab90-77423a7832ce\") " pod="openstack/glance-db-create-q8z7r"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.626912 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-operator-scripts\") pod \"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b\" (UID: \"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b\") "
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.627401 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjcq7\" (UniqueName: \"kubernetes.io/projected/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-kube-api-access-tjcq7\") pod \"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b\" (UID: \"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b\") "
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.627739 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73420a48-bf28-40cc-b232-fab14ef5745e-operator-scripts\") pod \"glance-1d7e-account-create-update-bhj4l\" (UID: \"73420a48-bf28-40cc-b232-fab14ef5745e\") " pod="openstack/glance-1d7e-account-create-update-bhj4l"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.627799 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qppxj\" (UniqueName: \"kubernetes.io/projected/73420a48-bf28-40cc-b232-fab14ef5745e-kube-api-access-qppxj\") pod \"glance-1d7e-account-create-update-bhj4l\" (UID: \"73420a48-bf28-40cc-b232-fab14ef5745e\") " pod="openstack/glance-1d7e-account-create-update-bhj4l"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.630217 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b" (UID: "d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.630937 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73420a48-bf28-40cc-b232-fab14ef5745e-operator-scripts\") pod \"glance-1d7e-account-create-update-bhj4l\" (UID: \"73420a48-bf28-40cc-b232-fab14ef5745e\") " pod="openstack/glance-1d7e-account-create-update-bhj4l"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.631936 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-q8z7r"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.634416 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-kube-api-access-tjcq7" (OuterVolumeSpecName: "kube-api-access-tjcq7") pod "d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b" (UID: "d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b"). InnerVolumeSpecName "kube-api-access-tjcq7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.649894 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qppxj\" (UniqueName: \"kubernetes.io/projected/73420a48-bf28-40cc-b232-fab14ef5745e-kube-api-access-qppxj\") pod \"glance-1d7e-account-create-update-bhj4l\" (UID: \"73420a48-bf28-40cc-b232-fab14ef5745e\") " pod="openstack/glance-1d7e-account-create-update-bhj4l"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.651785 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-2v24m"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.660102 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-4f7d-account-create-update-jpgd6"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.676984 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ab60-account-create-update-755dc"
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.729436 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-operator-scripts\") pod \"893edbf1-7994-499b-bd8e-45d7f3e9eb5f\" (UID: \"893edbf1-7994-499b-bd8e-45d7f3e9eb5f\") "
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.729493 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-operator-scripts\") pod \"d7aae0ad-43eb-43dd-b926-6b0847fa9eea\" (UID: \"d7aae0ad-43eb-43dd-b926-6b0847fa9eea\") "
Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.729625 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkzwg\" (UniqueName:
\"kubernetes.io/projected/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-kube-api-access-jkzwg\") pod \"893edbf1-7994-499b-bd8e-45d7f3e9eb5f\" (UID: \"893edbf1-7994-499b-bd8e-45d7f3e9eb5f\") " Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.729728 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plzth\" (UniqueName: \"kubernetes.io/projected/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-kube-api-access-plzth\") pod \"d7aae0ad-43eb-43dd-b926-6b0847fa9eea\" (UID: \"d7aae0ad-43eb-43dd-b926-6b0847fa9eea\") " Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.730250 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjcq7\" (UniqueName: \"kubernetes.io/projected/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-kube-api-access-tjcq7\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.730287 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.734614 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7aae0ad-43eb-43dd-b926-6b0847fa9eea" (UID: "d7aae0ad-43eb-43dd-b926-6b0847fa9eea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.734871 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-kube-api-access-plzth" (OuterVolumeSpecName: "kube-api-access-plzth") pod "d7aae0ad-43eb-43dd-b926-6b0847fa9eea" (UID: "d7aae0ad-43eb-43dd-b926-6b0847fa9eea"). InnerVolumeSpecName "kube-api-access-plzth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.737706 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "893edbf1-7994-499b-bd8e-45d7f3e9eb5f" (UID: "893edbf1-7994-499b-bd8e-45d7f3e9eb5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.742140 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-kube-api-access-jkzwg" (OuterVolumeSpecName: "kube-api-access-jkzwg") pod "893edbf1-7994-499b-bd8e-45d7f3e9eb5f" (UID: "893edbf1-7994-499b-bd8e-45d7f3e9eb5f"). InnerVolumeSpecName "kube-api-access-jkzwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.749938 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-4f7d-account-create-update-jpgd6" event={"ID":"d7aae0ad-43eb-43dd-b926-6b0847fa9eea","Type":"ContainerDied","Data":"4f945a30cc7553f86b2abf7137b20a27b700b7fba8438fde330eaad1d1cac6bc"} Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.749957 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-4f7d-account-create-update-jpgd6" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.749973 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f945a30cc7553f86b2abf7137b20a27b700b7fba8438fde330eaad1d1cac6bc" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.756545 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ab60-account-create-update-755dc" event={"ID":"72c57601-e6f1-4092-9d4a-49e8c5cf38e3","Type":"ContainerDied","Data":"7e42e8194544d421fb4c9367422d360254af4713b35abe4094fb6901764133e8"} Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.756578 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e42e8194544d421fb4c9367422d360254af4713b35abe4094fb6901764133e8" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.756629 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ab60-account-create-update-755dc" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.758293 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-2v24m" event={"ID":"893edbf1-7994-499b-bd8e-45d7f3e9eb5f","Type":"ContainerDied","Data":"ed551e75aecbf0bf6af6b254de8571265ffd86c9c070065a82ba3475f422de37"} Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.758315 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed551e75aecbf0bf6af6b254de8571265ffd86c9c070065a82ba3475f422de37" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.758351 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-2v24m" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.777429 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxzll" event={"ID":"e71e3f37-f071-4790-977c-915af714faf8","Type":"ContainerDied","Data":"cb7f4d07d81b9fb8eec72fa8b75130880feae6b18d4a831114789881d0aef69b"} Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.777551 4788 scope.go:117] "RemoveContainer" containerID="d5b736388630ef01dd35f5ee64b984608de37ff930e6cc9a8b72106a587ce63b" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.778013 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxzll" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.791051 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8tsc9" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.791139 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8tsc9" event={"ID":"d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b","Type":"ContainerDied","Data":"27bd8fbee73dcb03d901324c52d34412f2f5c0fd8e86235300d10b4bb12d642c"} Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.791185 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27bd8fbee73dcb03d901324c52d34412f2f5c0fd8e86235300d10b4bb12d642c" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.792524 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.812188 4788 scope.go:117] "RemoveContainer" containerID="ee954c1748690c0ec9ad95d4b7cad546a42b42dc0cebfd7025df44eeb6acfc0f" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.831869 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-operator-scripts\") pod \"72c57601-e6f1-4092-9d4a-49e8c5cf38e3\" (UID: \"72c57601-e6f1-4092-9d4a-49e8c5cf38e3\") " Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.831939 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgjtx\" (UniqueName: \"kubernetes.io/projected/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-kube-api-access-bgjtx\") pod \"72c57601-e6f1-4092-9d4a-49e8c5cf38e3\" (UID: \"72c57601-e6f1-4092-9d4a-49e8c5cf38e3\") " Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.832421 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.832443 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkzwg\" (UniqueName: \"kubernetes.io/projected/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-kube-api-access-jkzwg\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.832458 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plzth\" (UniqueName: \"kubernetes.io/projected/d7aae0ad-43eb-43dd-b926-6b0847fa9eea-kube-api-access-plzth\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.832469 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/893edbf1-7994-499b-bd8e-45d7f3e9eb5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.834632 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72c57601-e6f1-4092-9d4a-49e8c5cf38e3" (UID: 
"72c57601-e6f1-4092-9d4a-49e8c5cf38e3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.834816 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1d7e-account-create-update-bhj4l" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.838548 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-kube-api-access-bgjtx" (OuterVolumeSpecName: "kube-api-access-bgjtx") pod "72c57601-e6f1-4092-9d4a-49e8c5cf38e3" (UID: "72c57601-e6f1-4092-9d4a-49e8c5cf38e3"). InnerVolumeSpecName "kube-api-access-bgjtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.899949 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.912528 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxzll"] Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.933605 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.933629 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgjtx\" (UniqueName: \"kubernetes.io/projected/72c57601-e6f1-4092-9d4a-49e8c5cf38e3-kube-api-access-bgjtx\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.948395 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vxzll"] Feb 19 09:00:45 crc kubenswrapper[4788]: I0219 09:00:45.980940 4788 scope.go:117] "RemoveContainer" 
containerID="14139638b12f921d11e81828b9b1bb8df1ec2c0a469bbb53a52a7be82ec67dd4" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.312639 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-q8z7r"] Feb 19 09:00:46 crc kubenswrapper[4788]: W0219 09:00:46.322543 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1e0aea8_b2f5_42f4_ab90_77423a7832ce.slice/crio-0014c853c3d8ba8c702acab246514a7d83ef4105c2dd33d2f7294a6bc806e1b4 WatchSource:0}: Error finding container 0014c853c3d8ba8c702acab246514a7d83ef4105c2dd33d2f7294a6bc806e1b4: Status 404 returned error can't find the container with id 0014c853c3d8ba8c702acab246514a7d83ef4105c2dd33d2f7294a6bc806e1b4 Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.443320 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1d7e-account-create-update-bhj4l"] Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.673499 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.673821 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.724395 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0da7e1-775f-413a-a2fc-3ee4232048e0" path="/var/lib/kubelet/pods/df0da7e1-775f-413a-a2fc-3ee4232048e0/volumes" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.730070 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71e3f37-f071-4790-977c-915af714faf8" path="/var/lib/kubelet/pods/e71e3f37-f071-4790-977c-915af714faf8/volumes" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.804954 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-q8z7r" 
event={"ID":"f1e0aea8-b2f5-42f4-ab90-77423a7832ce","Type":"ContainerStarted","Data":"0014c853c3d8ba8c702acab246514a7d83ef4105c2dd33d2f7294a6bc806e1b4"} Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.811090 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1d7e-account-create-update-bhj4l" event={"ID":"73420a48-bf28-40cc-b232-fab14ef5745e","Type":"ContainerStarted","Data":"a3cc9c3922978ff94c192223c9363e2b3080930bdd703767fa989ac6780645dc"} Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.976893 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-k952c"] Feb 19 09:00:46 crc kubenswrapper[4788]: E0219 09:00:46.977206 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c57601-e6f1-4092-9d4a-49e8c5cf38e3" containerName="mariadb-account-create-update" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.977217 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c57601-e6f1-4092-9d4a-49e8c5cf38e3" containerName="mariadb-account-create-update" Feb 19 09:00:46 crc kubenswrapper[4788]: E0219 09:00:46.977233 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b" containerName="mariadb-database-create" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.977239 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b" containerName="mariadb-database-create" Feb 19 09:00:46 crc kubenswrapper[4788]: E0219 09:00:46.977264 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="893edbf1-7994-499b-bd8e-45d7f3e9eb5f" containerName="mariadb-database-create" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.977271 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="893edbf1-7994-499b-bd8e-45d7f3e9eb5f" containerName="mariadb-database-create" Feb 19 09:00:46 crc kubenswrapper[4788]: E0219 09:00:46.977287 4788 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d7aae0ad-43eb-43dd-b926-6b0847fa9eea" containerName="mariadb-account-create-update" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.977294 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7aae0ad-43eb-43dd-b926-6b0847fa9eea" containerName="mariadb-account-create-update" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.977425 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="893edbf1-7994-499b-bd8e-45d7f3e9eb5f" containerName="mariadb-database-create" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.977444 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b" containerName="mariadb-database-create" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.977452 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7aae0ad-43eb-43dd-b926-6b0847fa9eea" containerName="mariadb-account-create-update" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.977462 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="72c57601-e6f1-4092-9d4a-49e8c5cf38e3" containerName="mariadb-account-create-update" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.977980 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k952c" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.984265 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 19 09:00:46 crc kubenswrapper[4788]: I0219 09:00:46.995424 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k952c"] Feb 19 09:00:47 crc kubenswrapper[4788]: I0219 09:00:47.055705 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfp7d\" (UniqueName: \"kubernetes.io/projected/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-kube-api-access-kfp7d\") pod \"root-account-create-update-k952c\" (UID: \"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66\") " pod="openstack/root-account-create-update-k952c" Feb 19 09:00:47 crc kubenswrapper[4788]: I0219 09:00:47.055874 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-operator-scripts\") pod \"root-account-create-update-k952c\" (UID: \"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66\") " pod="openstack/root-account-create-update-k952c" Feb 19 09:00:47 crc kubenswrapper[4788]: I0219 09:00:47.157406 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:47 crc kubenswrapper[4788]: I0219 09:00:47.157517 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfp7d\" (UniqueName: \"kubernetes.io/projected/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-kube-api-access-kfp7d\") pod \"root-account-create-update-k952c\" (UID: \"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66\") " 
pod="openstack/root-account-create-update-k952c" Feb 19 09:00:47 crc kubenswrapper[4788]: E0219 09:00:47.157590 4788 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 09:00:47 crc kubenswrapper[4788]: E0219 09:00:47.157620 4788 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 09:00:47 crc kubenswrapper[4788]: E0219 09:00:47.157671 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift podName:f91d0357-4651-41b6-a842-27d8c7f47e60 nodeName:}" failed. No retries permitted until 2026-02-19 09:00:51.157655236 +0000 UTC m=+953.145666708 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift") pod "swift-storage-0" (UID: "f91d0357-4651-41b6-a842-27d8c7f47e60") : configmap "swift-ring-files" not found Feb 19 09:00:47 crc kubenswrapper[4788]: I0219 09:00:47.157600 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-operator-scripts\") pod \"root-account-create-update-k952c\" (UID: \"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66\") " pod="openstack/root-account-create-update-k952c" Feb 19 09:00:47 crc kubenswrapper[4788]: I0219 09:00:47.158349 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-operator-scripts\") pod \"root-account-create-update-k952c\" (UID: \"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66\") " pod="openstack/root-account-create-update-k952c" Feb 19 09:00:47 crc kubenswrapper[4788]: I0219 09:00:47.183247 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kfp7d\" (UniqueName: \"kubernetes.io/projected/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-kube-api-access-kfp7d\") pod \"root-account-create-update-k952c\" (UID: \"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66\") " pod="openstack/root-account-create-update-k952c" Feb 19 09:00:47 crc kubenswrapper[4788]: I0219 09:00:47.295419 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k952c" Feb 19 09:00:47 crc kubenswrapper[4788]: I0219 09:00:47.738041 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nc4p7" podUID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" containerName="registry-server" probeResult="failure" output=< Feb 19 09:00:47 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 09:00:47 crc kubenswrapper[4788]: > Feb 19 09:00:47 crc kubenswrapper[4788]: I0219 09:00:47.757869 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k952c"] Feb 19 09:00:47 crc kubenswrapper[4788]: W0219 09:00:47.766422 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbcb1ba9_aaef_4fcf_9caa_e9570e97ea66.slice/crio-8413c3bff332376fb094902c0ff25aa8d85531a3c3324509c8b9dca45f3a3525 WatchSource:0}: Error finding container 8413c3bff332376fb094902c0ff25aa8d85531a3c3324509c8b9dca45f3a3525: Status 404 returned error can't find the container with id 8413c3bff332376fb094902c0ff25aa8d85531a3c3324509c8b9dca45f3a3525 Feb 19 09:00:47 crc kubenswrapper[4788]: I0219 09:00:47.818438 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k952c" event={"ID":"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66","Type":"ContainerStarted","Data":"8413c3bff332376fb094902c0ff25aa8d85531a3c3324509c8b9dca45f3a3525"} Feb 19 09:00:48 crc kubenswrapper[4788]: I0219 09:00:48.326970 4788 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-9l9sr"] Feb 19 09:00:48 crc kubenswrapper[4788]: I0219 09:00:48.328571 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9l9sr" podUID="e2f5fb48-2371-47b0-be72-9521d37955bc" containerName="registry-server" containerID="cri-o://ef989681e48ec05d6020c8ee3961f4d477857a509033ba8dc90733e6c0d7d9ae" gracePeriod=2 Feb 19 09:00:48 crc kubenswrapper[4788]: I0219 09:00:48.826520 4788 generic.go:334] "Generic (PLEG): container finished" podID="e2f5fb48-2371-47b0-be72-9521d37955bc" containerID="ef989681e48ec05d6020c8ee3961f4d477857a509033ba8dc90733e6c0d7d9ae" exitCode=0 Feb 19 09:00:48 crc kubenswrapper[4788]: I0219 09:00:48.826581 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9sr" event={"ID":"e2f5fb48-2371-47b0-be72-9521d37955bc","Type":"ContainerDied","Data":"ef989681e48ec05d6020c8ee3961f4d477857a509033ba8dc90733e6c0d7d9ae"} Feb 19 09:00:49 crc kubenswrapper[4788]: I0219 09:00:49.834923 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-q8z7r" event={"ID":"f1e0aea8-b2f5-42f4-ab90-77423a7832ce","Type":"ContainerStarted","Data":"72a01b937262c26236c20bc0835900b832146b77519d07b8221c1e916fd1b451"} Feb 19 09:00:49 crc kubenswrapper[4788]: I0219 09:00:49.836396 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1d7e-account-create-update-bhj4l" event={"ID":"73420a48-bf28-40cc-b232-fab14ef5745e","Type":"ContainerStarted","Data":"91dca1f13e60bcae0ab1d1fb4d153cbe06f24e26897618a01929ca3b69b758ab"} Feb 19 09:00:50 crc kubenswrapper[4788]: I0219 09:00:50.847160 4788 generic.go:334] "Generic (PLEG): container finished" podID="dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66" containerID="8c7107ad9f27bebbeef0be275cdba9100ba79e714caa256853032f1e9accd1be" exitCode=0 Feb 19 09:00:50 crc kubenswrapper[4788]: I0219 09:00:50.847259 4788 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k952c" event={"ID":"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66","Type":"ContainerDied","Data":"8c7107ad9f27bebbeef0be275cdba9100ba79e714caa256853032f1e9accd1be"} Feb 19 09:00:50 crc kubenswrapper[4788]: I0219 09:00:50.848842 4788 generic.go:334] "Generic (PLEG): container finished" podID="73420a48-bf28-40cc-b232-fab14ef5745e" containerID="91dca1f13e60bcae0ab1d1fb4d153cbe06f24e26897618a01929ca3b69b758ab" exitCode=0 Feb 19 09:00:50 crc kubenswrapper[4788]: I0219 09:00:50.848905 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1d7e-account-create-update-bhj4l" event={"ID":"73420a48-bf28-40cc-b232-fab14ef5745e","Type":"ContainerDied","Data":"91dca1f13e60bcae0ab1d1fb4d153cbe06f24e26897618a01929ca3b69b758ab"} Feb 19 09:00:50 crc kubenswrapper[4788]: I0219 09:00:50.851090 4788 generic.go:334] "Generic (PLEG): container finished" podID="f1e0aea8-b2f5-42f4-ab90-77423a7832ce" containerID="72a01b937262c26236c20bc0835900b832146b77519d07b8221c1e916fd1b451" exitCode=0 Feb 19 09:00:50 crc kubenswrapper[4788]: I0219 09:00:50.851132 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-q8z7r" event={"ID":"f1e0aea8-b2f5-42f4-ab90-77423a7832ce","Type":"ContainerDied","Data":"72a01b937262c26236c20bc0835900b832146b77519d07b8221c1e916fd1b451"} Feb 19 09:00:51 crc kubenswrapper[4788]: I0219 09:00:51.163750 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:51 crc kubenswrapper[4788]: E0219 09:00:51.163875 4788 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 09:00:51 crc kubenswrapper[4788]: E0219 09:00:51.163892 4788 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 09:00:51 crc kubenswrapper[4788]: E0219 09:00:51.163947 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift podName:f91d0357-4651-41b6-a842-27d8c7f47e60 nodeName:}" failed. No retries permitted until 2026-02-19 09:00:59.163931753 +0000 UTC m=+961.151943225 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift") pod "swift-storage-0" (UID: "f91d0357-4651-41b6-a842-27d8c7f47e60") : configmap "swift-ring-files" not found Feb 19 09:00:52 crc kubenswrapper[4788]: I0219 09:00:52.139499 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:00:52 crc kubenswrapper[4788]: I0219 09:00:52.140122 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:00:52 crc kubenswrapper[4788]: I0219 09:00:52.140175 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 09:00:52 crc kubenswrapper[4788]: I0219 09:00:52.142377 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5bb01dc9098ceeecfb8b55d79c5464f45f8f2b74f74a77633116b07488417cd"} 
pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 09:00:52 crc kubenswrapper[4788]: I0219 09:00:52.142445 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://d5bb01dc9098ceeecfb8b55d79c5464f45f8f2b74f74a77633116b07488417cd" gracePeriod=600 Feb 19 09:00:52 crc kubenswrapper[4788]: I0219 09:00:52.492558 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:00:52 crc kubenswrapper[4788]: I0219 09:00:52.559965 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8lk8v"] Feb 19 09:00:52 crc kubenswrapper[4788]: I0219 09:00:52.560180 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" podUID="23ee4096-6b61-438e-a3e1-ba9e720abd80" containerName="dnsmasq-dns" containerID="cri-o://143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d" gracePeriod=10 Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.388563 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-q8z7r" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.407027 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k952c" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.413630 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1d7e-account-create-update-bhj4l" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.449206 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.512801 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-operator-scripts\") pod \"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66\" (UID: \"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.512845 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-operator-scripts\") pod \"f1e0aea8-b2f5-42f4-ab90-77423a7832ce\" (UID: \"f1e0aea8-b2f5-42f4-ab90-77423a7832ce\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.512920 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73420a48-bf28-40cc-b232-fab14ef5745e-operator-scripts\") pod \"73420a48-bf28-40cc-b232-fab14ef5745e\" (UID: \"73420a48-bf28-40cc-b232-fab14ef5745e\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.512961 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qppxj\" (UniqueName: \"kubernetes.io/projected/73420a48-bf28-40cc-b232-fab14ef5745e-kube-api-access-qppxj\") pod \"73420a48-bf28-40cc-b232-fab14ef5745e\" (UID: \"73420a48-bf28-40cc-b232-fab14ef5745e\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.513040 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfp7d\" (UniqueName: \"kubernetes.io/projected/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-kube-api-access-kfp7d\") pod \"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66\" (UID: \"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.513100 4788 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ws4rz\" (UniqueName: \"kubernetes.io/projected/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-kube-api-access-ws4rz\") pod \"f1e0aea8-b2f5-42f4-ab90-77423a7832ce\" (UID: \"f1e0aea8-b2f5-42f4-ab90-77423a7832ce\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.516927 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73420a48-bf28-40cc-b232-fab14ef5745e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73420a48-bf28-40cc-b232-fab14ef5745e" (UID: "73420a48-bf28-40cc-b232-fab14ef5745e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.519109 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66" (UID: "dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.519476 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f1e0aea8-b2f5-42f4-ab90-77423a7832ce" (UID: "f1e0aea8-b2f5-42f4-ab90-77423a7832ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.522024 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73420a48-bf28-40cc-b232-fab14ef5745e-kube-api-access-qppxj" (OuterVolumeSpecName: "kube-api-access-qppxj") pod "73420a48-bf28-40cc-b232-fab14ef5745e" (UID: "73420a48-bf28-40cc-b232-fab14ef5745e"). 
InnerVolumeSpecName "kube-api-access-qppxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.523372 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-kube-api-access-ws4rz" (OuterVolumeSpecName: "kube-api-access-ws4rz") pod "f1e0aea8-b2f5-42f4-ab90-77423a7832ce" (UID: "f1e0aea8-b2f5-42f4-ab90-77423a7832ce"). InnerVolumeSpecName "kube-api-access-ws4rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.525868 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-kube-api-access-kfp7d" (OuterVolumeSpecName: "kube-api-access-kfp7d") pod "dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66" (UID: "dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66"). InnerVolumeSpecName "kube-api-access-kfp7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.542637 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614272 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-utilities\") pod \"e2f5fb48-2371-47b0-be72-9521d37955bc\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614372 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-config\") pod \"23ee4096-6b61-438e-a3e1-ba9e720abd80\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614422 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-catalog-content\") pod \"e2f5fb48-2371-47b0-be72-9521d37955bc\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614447 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kgwn\" (UniqueName: \"kubernetes.io/projected/23ee4096-6b61-438e-a3e1-ba9e720abd80-kube-api-access-2kgwn\") pod \"23ee4096-6b61-438e-a3e1-ba9e720abd80\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614552 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9z6c\" (UniqueName: \"kubernetes.io/projected/e2f5fb48-2371-47b0-be72-9521d37955bc-kube-api-access-j9z6c\") pod \"e2f5fb48-2371-47b0-be72-9521d37955bc\" (UID: \"e2f5fb48-2371-47b0-be72-9521d37955bc\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614574 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-dns-svc\") pod \"23ee4096-6b61-438e-a3e1-ba9e720abd80\" (UID: \"23ee4096-6b61-438e-a3e1-ba9e720abd80\") " Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614884 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws4rz\" (UniqueName: \"kubernetes.io/projected/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-kube-api-access-ws4rz\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614902 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614912 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1e0aea8-b2f5-42f4-ab90-77423a7832ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614921 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73420a48-bf28-40cc-b232-fab14ef5745e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614930 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qppxj\" (UniqueName: \"kubernetes.io/projected/73420a48-bf28-40cc-b232-fab14ef5745e-kube-api-access-qppxj\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.614938 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfp7d\" (UniqueName: \"kubernetes.io/projected/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66-kube-api-access-kfp7d\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.615067 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-utilities" (OuterVolumeSpecName: "utilities") pod "e2f5fb48-2371-47b0-be72-9521d37955bc" (UID: "e2f5fb48-2371-47b0-be72-9521d37955bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.619669 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f5fb48-2371-47b0-be72-9521d37955bc-kube-api-access-j9z6c" (OuterVolumeSpecName: "kube-api-access-j9z6c") pod "e2f5fb48-2371-47b0-be72-9521d37955bc" (UID: "e2f5fb48-2371-47b0-be72-9521d37955bc"). InnerVolumeSpecName "kube-api-access-j9z6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.631998 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ee4096-6b61-438e-a3e1-ba9e720abd80-kube-api-access-2kgwn" (OuterVolumeSpecName: "kube-api-access-2kgwn") pod "23ee4096-6b61-438e-a3e1-ba9e720abd80" (UID: "23ee4096-6b61-438e-a3e1-ba9e720abd80"). InnerVolumeSpecName "kube-api-access-2kgwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.654321 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "23ee4096-6b61-438e-a3e1-ba9e720abd80" (UID: "23ee4096-6b61-438e-a3e1-ba9e720abd80"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.655298 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-config" (OuterVolumeSpecName: "config") pod "23ee4096-6b61-438e-a3e1-ba9e720abd80" (UID: "23ee4096-6b61-438e-a3e1-ba9e720abd80"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.661294 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2f5fb48-2371-47b0-be72-9521d37955bc" (UID: "e2f5fb48-2371-47b0-be72-9521d37955bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.716891 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.716919 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.716929 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23ee4096-6b61-438e-a3e1-ba9e720abd80-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.716937 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2f5fb48-2371-47b0-be72-9521d37955bc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.716947 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kgwn\" (UniqueName: \"kubernetes.io/projected/23ee4096-6b61-438e-a3e1-ba9e720abd80-kube-api-access-2kgwn\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.716956 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9z6c\" (UniqueName: 
\"kubernetes.io/projected/e2f5fb48-2371-47b0-be72-9521d37955bc-kube-api-access-j9z6c\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.879495 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9l9sr" event={"ID":"e2f5fb48-2371-47b0-be72-9521d37955bc","Type":"ContainerDied","Data":"c438969f7705c765c34b80811523a8a21910a2b7907d59bfd1eca688d6e23682"} Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.880135 4788 scope.go:117] "RemoveContainer" containerID="ef989681e48ec05d6020c8ee3961f4d477857a509033ba8dc90733e6c0d7d9ae" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.879699 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9l9sr" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.882079 4788 generic.go:334] "Generic (PLEG): container finished" podID="23ee4096-6b61-438e-a3e1-ba9e720abd80" containerID="143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d" exitCode=0 Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.882135 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" event={"ID":"23ee4096-6b61-438e-a3e1-ba9e720abd80","Type":"ContainerDied","Data":"143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d"} Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.882160 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" event={"ID":"23ee4096-6b61-438e-a3e1-ba9e720abd80","Type":"ContainerDied","Data":"d5ebaa688224516945264c61a8788f393cf41095589e0732da83ddd01dd8142c"} Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.882223 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8lk8v" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.915079 4788 scope.go:117] "RemoveContainer" containerID="3c7bf5b956189bc39d2dc49af684ca8e7b1d9f8526c2fcf96f94799c563cb4c9" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.921046 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1d7e-account-create-update-bhj4l" event={"ID":"73420a48-bf28-40cc-b232-fab14ef5745e","Type":"ContainerDied","Data":"a3cc9c3922978ff94c192223c9363e2b3080930bdd703767fa989ac6780645dc"} Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.921396 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3cc9c3922978ff94c192223c9363e2b3080930bdd703767fa989ac6780645dc" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.921514 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1d7e-account-create-update-bhj4l" Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.948311 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9l9sr"] Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.964619 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="d5bb01dc9098ceeecfb8b55d79c5464f45f8f2b74f74a77633116b07488417cd" exitCode=0 Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.964718 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"d5bb01dc9098ceeecfb8b55d79c5464f45f8f2b74f74a77633116b07488417cd"} Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.964745 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" 
event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"31cfa590dbe60cb7189f587f667407f74d6387f19ad0205b2e674711ceebc406"} Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.989723 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9l9sr"] Feb 19 09:00:53 crc kubenswrapper[4788]: I0219 09:00:53.990402 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tc7kz" event={"ID":"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a","Type":"ContainerStarted","Data":"980e7ed169d0e9890bf637663263e4e864a0ae3a412e6bb408a0e96c94406d57"} Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.005285 4788 scope.go:117] "RemoveContainer" containerID="a3f9eb95bd8f928b95b5e34cd98b9b17c41dc3419a949d2048889d57f5bc7fc8" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.030186 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-q8z7r" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.030762 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-q8z7r" event={"ID":"f1e0aea8-b2f5-42f4-ab90-77423a7832ce","Type":"ContainerDied","Data":"0014c853c3d8ba8c702acab246514a7d83ef4105c2dd33d2f7294a6bc806e1b4"} Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.030805 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0014c853c3d8ba8c702acab246514a7d83ef4105c2dd33d2f7294a6bc806e1b4" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.038863 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8lk8v"] Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.040554 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k952c" 
event={"ID":"dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66","Type":"ContainerDied","Data":"8413c3bff332376fb094902c0ff25aa8d85531a3c3324509c8b9dca45f3a3525"} Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.040600 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8413c3bff332376fb094902c0ff25aa8d85531a3c3324509c8b9dca45f3a3525" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.040692 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k952c" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.053328 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8lk8v"] Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.064625 4788 scope.go:117] "RemoveContainer" containerID="143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.070882 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-tc7kz" podStartSLOduration=2.469675665 podStartE2EDuration="11.07086451s" podCreationTimestamp="2026-02-19 09:00:43 +0000 UTC" firstStartedPulling="2026-02-19 09:00:44.703228983 +0000 UTC m=+946.691240455" lastFinishedPulling="2026-02-19 09:00:53.304417828 +0000 UTC m=+955.292429300" observedRunningTime="2026-02-19 09:00:54.065915005 +0000 UTC m=+956.053926487" watchObservedRunningTime="2026-02-19 09:00:54.07086451 +0000 UTC m=+956.058875982" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.083411 4788 scope.go:117] "RemoveContainer" containerID="0c752a8f59745466ad69d0b62dd9fdeccd40a7c9a3aba3ed99f6838c5c4154a5" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.105910 4788 scope.go:117] "RemoveContainer" containerID="143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d" Feb 19 09:00:54 crc kubenswrapper[4788]: E0219 09:00:54.106399 4788 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d\": container with ID starting with 143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d not found: ID does not exist" containerID="143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.106502 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d"} err="failed to get container status \"143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d\": rpc error: code = NotFound desc = could not find container \"143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d\": container with ID starting with 143b2efedf1c124eedc2bb3f71b9b7fc342455018cc68783f7d2b10a9048353d not found: ID does not exist" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.106584 4788 scope.go:117] "RemoveContainer" containerID="0c752a8f59745466ad69d0b62dd9fdeccd40a7c9a3aba3ed99f6838c5c4154a5" Feb 19 09:00:54 crc kubenswrapper[4788]: E0219 09:00:54.106896 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c752a8f59745466ad69d0b62dd9fdeccd40a7c9a3aba3ed99f6838c5c4154a5\": container with ID starting with 0c752a8f59745466ad69d0b62dd9fdeccd40a7c9a3aba3ed99f6838c5c4154a5 not found: ID does not exist" containerID="0c752a8f59745466ad69d0b62dd9fdeccd40a7c9a3aba3ed99f6838c5c4154a5" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.106985 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c752a8f59745466ad69d0b62dd9fdeccd40a7c9a3aba3ed99f6838c5c4154a5"} err="failed to get container status \"0c752a8f59745466ad69d0b62dd9fdeccd40a7c9a3aba3ed99f6838c5c4154a5\": rpc error: code = NotFound desc = could not find container 
\"0c752a8f59745466ad69d0b62dd9fdeccd40a7c9a3aba3ed99f6838c5c4154a5\": container with ID starting with 0c752a8f59745466ad69d0b62dd9fdeccd40a7c9a3aba3ed99f6838c5c4154a5 not found: ID does not exist" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.107062 4788 scope.go:117] "RemoveContainer" containerID="7e35f3f513564bff5ea2198c67409383ba481d995f59e4d674440d785743deb5" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.724710 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ee4096-6b61-438e-a3e1-ba9e720abd80" path="/var/lib/kubelet/pods/23ee4096-6b61-438e-a3e1-ba9e720abd80/volumes" Feb 19 09:00:54 crc kubenswrapper[4788]: I0219 09:00:54.725597 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f5fb48-2371-47b0-be72-9521d37955bc" path="/var/lib/kubelet/pods/e2f5fb48-2371-47b0-be72-9521d37955bc/volumes" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.623238 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jq4z9"] Feb 19 09:00:55 crc kubenswrapper[4788]: E0219 09:00:55.623851 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f5fb48-2371-47b0-be72-9521d37955bc" containerName="extract-content" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.623866 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f5fb48-2371-47b0-be72-9521d37955bc" containerName="extract-content" Feb 19 09:00:55 crc kubenswrapper[4788]: E0219 09:00:55.623883 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f5fb48-2371-47b0-be72-9521d37955bc" containerName="extract-utilities" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.623889 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f5fb48-2371-47b0-be72-9521d37955bc" containerName="extract-utilities" Feb 19 09:00:55 crc kubenswrapper[4788]: E0219 09:00:55.623899 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ee4096-6b61-438e-a3e1-ba9e720abd80" 
containerName="dnsmasq-dns" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.623905 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ee4096-6b61-438e-a3e1-ba9e720abd80" containerName="dnsmasq-dns" Feb 19 09:00:55 crc kubenswrapper[4788]: E0219 09:00:55.623913 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e0aea8-b2f5-42f4-ab90-77423a7832ce" containerName="mariadb-database-create" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.623920 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e0aea8-b2f5-42f4-ab90-77423a7832ce" containerName="mariadb-database-create" Feb 19 09:00:55 crc kubenswrapper[4788]: E0219 09:00:55.623939 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ee4096-6b61-438e-a3e1-ba9e720abd80" containerName="init" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.623951 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ee4096-6b61-438e-a3e1-ba9e720abd80" containerName="init" Feb 19 09:00:55 crc kubenswrapper[4788]: E0219 09:00:55.623970 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f5fb48-2371-47b0-be72-9521d37955bc" containerName="registry-server" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.623977 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f5fb48-2371-47b0-be72-9521d37955bc" containerName="registry-server" Feb 19 09:00:55 crc kubenswrapper[4788]: E0219 09:00:55.623991 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73420a48-bf28-40cc-b232-fab14ef5745e" containerName="mariadb-account-create-update" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.623997 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="73420a48-bf28-40cc-b232-fab14ef5745e" containerName="mariadb-account-create-update" Feb 19 09:00:55 crc kubenswrapper[4788]: E0219 09:00:55.624009 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66" 
containerName="mariadb-account-create-update" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.624016 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66" containerName="mariadb-account-create-update" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.624177 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="73420a48-bf28-40cc-b232-fab14ef5745e" containerName="mariadb-account-create-update" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.624189 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ee4096-6b61-438e-a3e1-ba9e720abd80" containerName="dnsmasq-dns" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.624196 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1e0aea8-b2f5-42f4-ab90-77423a7832ce" containerName="mariadb-database-create" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.624208 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f5fb48-2371-47b0-be72-9521d37955bc" containerName="registry-server" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.624215 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66" containerName="mariadb-account-create-update" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.624831 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.629959 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9pt5q" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.631852 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jq4z9"] Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.639424 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.759021 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-db-sync-config-data\") pod \"glance-db-sync-jq4z9\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.759424 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-combined-ca-bundle\") pod \"glance-db-sync-jq4z9\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.759482 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-config-data\") pod \"glance-db-sync-jq4z9\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.759503 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7mr\" (UniqueName: 
\"kubernetes.io/projected/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-kube-api-access-jb7mr\") pod \"glance-db-sync-jq4z9\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.862171 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-combined-ca-bundle\") pod \"glance-db-sync-jq4z9\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.862497 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-config-data\") pod \"glance-db-sync-jq4z9\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.862583 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7mr\" (UniqueName: \"kubernetes.io/projected/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-kube-api-access-jb7mr\") pod \"glance-db-sync-jq4z9\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.862693 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-db-sync-config-data\") pod \"glance-db-sync-jq4z9\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.869978 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-db-sync-config-data\") pod \"glance-db-sync-jq4z9\" 
(UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.870061 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-combined-ca-bundle\") pod \"glance-db-sync-jq4z9\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.870805 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-config-data\") pod \"glance-db-sync-jq4z9\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.882510 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb7mr\" (UniqueName: \"kubernetes.io/projected/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-kube-api-access-jb7mr\") pod \"glance-db-sync-jq4z9\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:55 crc kubenswrapper[4788]: I0219 09:00:55.940466 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jq4z9" Feb 19 09:00:56 crc kubenswrapper[4788]: I0219 09:00:56.587422 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jq4z9"] Feb 19 09:00:56 crc kubenswrapper[4788]: I0219 09:00:56.733572 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:56 crc kubenswrapper[4788]: I0219 09:00:56.785323 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:57 crc kubenswrapper[4788]: I0219 09:00:57.073944 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jq4z9" event={"ID":"bf1f8c5a-64ff-4e33-a3d0-409d025d567b","Type":"ContainerStarted","Data":"7b5c3e7e23c6cd4818ea318ddb44f2cc0481a95d3bc0bf5abb69aff628b2c1da"} Feb 19 09:00:57 crc kubenswrapper[4788]: I0219 09:00:57.373592 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 19 09:00:57 crc kubenswrapper[4788]: I0219 09:00:57.703291 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc4p7"] Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.081040 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nc4p7" podUID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" containerName="registry-server" containerID="cri-o://ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645" gracePeriod=2 Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.413372 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-k952c"] Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.428397 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-k952c"] Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 
09:00:58.573691 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.721586 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhzf2\" (UniqueName: \"kubernetes.io/projected/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-kube-api-access-fhzf2\") pod \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.721691 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-catalog-content\") pod \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.721731 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-utilities\") pod \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\" (UID: \"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3\") " Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.724936 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-utilities" (OuterVolumeSpecName: "utilities") pod "c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" (UID: "c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.740182 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-kube-api-access-fhzf2" (OuterVolumeSpecName: "kube-api-access-fhzf2") pod "c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" (UID: "c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3"). InnerVolumeSpecName "kube-api-access-fhzf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.757418 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66" path="/var/lib/kubelet/pods/dbcb1ba9-aaef-4fcf-9caa-e9570e97ea66/volumes" Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.824619 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhzf2\" (UniqueName: \"kubernetes.io/projected/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-kube-api-access-fhzf2\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.824659 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.870558 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" (UID: "c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:00:58 crc kubenswrapper[4788]: I0219 09:00:58.927644 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.091128 4788 generic.go:334] "Generic (PLEG): container finished" podID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" containerID="ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645" exitCode=0 Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.091214 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4p7" Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.091217 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4p7" event={"ID":"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3","Type":"ContainerDied","Data":"ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645"} Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.091813 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4p7" event={"ID":"c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3","Type":"ContainerDied","Data":"a4b9b6ff31af3047adf90ec5aad004b52a3973cb641bd1492ddaa97ddedaadf2"} Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.091881 4788 scope.go:117] "RemoveContainer" containerID="ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645" Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.132101 4788 scope.go:117] "RemoveContainer" containerID="731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb" Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.138128 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc4p7"] Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 
09:00:59.145339 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nc4p7"] Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.149945 4788 scope.go:117] "RemoveContainer" containerID="2646a3a265de95b1955be23f3f08e09427fc77b6b2d889bc0f1d760307603147" Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.188179 4788 scope.go:117] "RemoveContainer" containerID="ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645" Feb 19 09:00:59 crc kubenswrapper[4788]: E0219 09:00:59.188724 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645\": container with ID starting with ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645 not found: ID does not exist" containerID="ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645" Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.188764 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645"} err="failed to get container status \"ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645\": rpc error: code = NotFound desc = could not find container \"ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645\": container with ID starting with ad6435bed59e79fcbda274de6c23fb807729cd33e18d344985968db520bdd645 not found: ID does not exist" Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.188792 4788 scope.go:117] "RemoveContainer" containerID="731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb" Feb 19 09:00:59 crc kubenswrapper[4788]: E0219 09:00:59.189022 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb\": container with ID 
starting with 731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb not found: ID does not exist" containerID="731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb" Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.189060 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb"} err="failed to get container status \"731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb\": rpc error: code = NotFound desc = could not find container \"731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb\": container with ID starting with 731021a7d61498b65bca623da09541c0e98293c900c2377ed4cea2fee46bbddb not found: ID does not exist" Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.189079 4788 scope.go:117] "RemoveContainer" containerID="2646a3a265de95b1955be23f3f08e09427fc77b6b2d889bc0f1d760307603147" Feb 19 09:00:59 crc kubenswrapper[4788]: E0219 09:00:59.189299 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2646a3a265de95b1955be23f3f08e09427fc77b6b2d889bc0f1d760307603147\": container with ID starting with 2646a3a265de95b1955be23f3f08e09427fc77b6b2d889bc0f1d760307603147 not found: ID does not exist" containerID="2646a3a265de95b1955be23f3f08e09427fc77b6b2d889bc0f1d760307603147" Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.189384 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2646a3a265de95b1955be23f3f08e09427fc77b6b2d889bc0f1d760307603147"} err="failed to get container status \"2646a3a265de95b1955be23f3f08e09427fc77b6b2d889bc0f1d760307603147\": rpc error: code = NotFound desc = could not find container \"2646a3a265de95b1955be23f3f08e09427fc77b6b2d889bc0f1d760307603147\": container with ID starting with 2646a3a265de95b1955be23f3f08e09427fc77b6b2d889bc0f1d760307603147 not found: 
ID does not exist" Feb 19 09:00:59 crc kubenswrapper[4788]: I0219 09:00:59.234305 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:00:59 crc kubenswrapper[4788]: E0219 09:00:59.234492 4788 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 19 09:00:59 crc kubenswrapper[4788]: E0219 09:00:59.234676 4788 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 19 09:00:59 crc kubenswrapper[4788]: E0219 09:00:59.234754 4788 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift podName:f91d0357-4651-41b6-a842-27d8c7f47e60 nodeName:}" failed. No retries permitted until 2026-02-19 09:01:15.23471375 +0000 UTC m=+977.222725222 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift") pod "swift-storage-0" (UID: "f91d0357-4651-41b6-a842-27d8c7f47e60") : configmap "swift-ring-files" not found Feb 19 09:01:00 crc kubenswrapper[4788]: I0219 09:01:00.724323 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" path="/var/lib/kubelet/pods/c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3/volumes" Feb 19 09:01:01 crc kubenswrapper[4788]: I0219 09:01:01.109497 4788 generic.go:334] "Generic (PLEG): container finished" podID="7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a" containerID="980e7ed169d0e9890bf637663263e4e864a0ae3a412e6bb408a0e96c94406d57" exitCode=0 Feb 19 09:01:01 crc kubenswrapper[4788]: I0219 09:01:01.109544 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tc7kz" event={"ID":"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a","Type":"ContainerDied","Data":"980e7ed169d0e9890bf637663263e4e864a0ae3a412e6bb408a0e96c94406d57"} Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.461438 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2qxmb"] Feb 19 09:01:03 crc kubenswrapper[4788]: E0219 09:01:03.463964 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" containerName="extract-utilities" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.463995 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" containerName="extract-utilities" Feb 19 09:01:03 crc kubenswrapper[4788]: E0219 09:01:03.464066 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" containerName="registry-server" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.464080 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" 
containerName="registry-server" Feb 19 09:01:03 crc kubenswrapper[4788]: E0219 09:01:03.464111 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" containerName="extract-content" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.464123 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" containerName="extract-content" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.464775 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4bdf43c-d4f9-49ac-a347-d1f9528bb4b3" containerName="registry-server" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.470715 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2qxmb" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.473314 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2qxmb"] Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.475084 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.556818 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd64b1ce-8564-4921-b382-8adf535a61a4-operator-scripts\") pod \"root-account-create-update-2qxmb\" (UID: \"cd64b1ce-8564-4921-b382-8adf535a61a4\") " pod="openstack/root-account-create-update-2qxmb" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.557034 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glvjt\" (UniqueName: \"kubernetes.io/projected/cd64b1ce-8564-4921-b382-8adf535a61a4-kube-api-access-glvjt\") pod \"root-account-create-update-2qxmb\" (UID: \"cd64b1ce-8564-4921-b382-8adf535a61a4\") " 
pod="openstack/root-account-create-update-2qxmb" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.659002 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd64b1ce-8564-4921-b382-8adf535a61a4-operator-scripts\") pod \"root-account-create-update-2qxmb\" (UID: \"cd64b1ce-8564-4921-b382-8adf535a61a4\") " pod="openstack/root-account-create-update-2qxmb" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.659050 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glvjt\" (UniqueName: \"kubernetes.io/projected/cd64b1ce-8564-4921-b382-8adf535a61a4-kube-api-access-glvjt\") pod \"root-account-create-update-2qxmb\" (UID: \"cd64b1ce-8564-4921-b382-8adf535a61a4\") " pod="openstack/root-account-create-update-2qxmb" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.660005 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd64b1ce-8564-4921-b382-8adf535a61a4-operator-scripts\") pod \"root-account-create-update-2qxmb\" (UID: \"cd64b1ce-8564-4921-b382-8adf535a61a4\") " pod="openstack/root-account-create-update-2qxmb" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.700075 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glvjt\" (UniqueName: \"kubernetes.io/projected/cd64b1ce-8564-4921-b382-8adf535a61a4-kube-api-access-glvjt\") pod \"root-account-create-update-2qxmb\" (UID: \"cd64b1ce-8564-4921-b382-8adf535a61a4\") " pod="openstack/root-account-create-update-2qxmb" Feb 19 09:01:03 crc kubenswrapper[4788]: I0219 09:01:03.857978 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2qxmb" Feb 19 09:01:04 crc kubenswrapper[4788]: I0219 09:01:04.131134 4788 generic.go:334] "Generic (PLEG): container finished" podID="bb0abe11-b278-4a3a-aeda-3e08a603924b" containerID="60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5" exitCode=0 Feb 19 09:01:04 crc kubenswrapper[4788]: I0219 09:01:04.131276 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bb0abe11-b278-4a3a-aeda-3e08a603924b","Type":"ContainerDied","Data":"60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5"} Feb 19 09:01:04 crc kubenswrapper[4788]: I0219 09:01:04.133928 4788 generic.go:334] "Generic (PLEG): container finished" podID="ad57631d-1772-49f0-ae6b-f16ee556e9c4" containerID="deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36" exitCode=0 Feb 19 09:01:04 crc kubenswrapper[4788]: I0219 09:01:04.133973 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad57631d-1772-49f0-ae6b-f16ee556e9c4","Type":"ContainerDied","Data":"deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36"} Feb 19 09:01:05 crc kubenswrapper[4788]: I0219 09:01:05.599831 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-msb2b" podUID="be836fd0-7c7e-4824-b455-bb4ccec1163e" containerName="ovn-controller" probeResult="failure" output=< Feb 19 09:01:05 crc kubenswrapper[4788]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 09:01:05 crc kubenswrapper[4788]: > Feb 19 09:01:05 crc kubenswrapper[4788]: I0219 09:01:05.649535 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-snwhx" Feb 19 09:01:05 crc kubenswrapper[4788]: I0219 09:01:05.650747 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-snwhx" Feb 19 
09:01:05 crc kubenswrapper[4788]: I0219 09:01:05.875055 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-msb2b-config-4m72q"] Feb 19 09:01:05 crc kubenswrapper[4788]: I0219 09:01:05.876231 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:05 crc kubenswrapper[4788]: I0219 09:01:05.879747 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 09:01:05 crc kubenswrapper[4788]: I0219 09:01:05.909190 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-msb2b-config-4m72q"] Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.005267 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p45mr\" (UniqueName: \"kubernetes.io/projected/f56e3a1b-d229-4eae-a010-604c46e41730-kube-api-access-p45mr\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.005327 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run-ovn\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.005393 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-scripts\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 
09:01:06.005433 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-additional-scripts\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.005476 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.005506 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-log-ovn\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.106577 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-scripts\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.106639 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-additional-scripts\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 
09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.106678 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.106707 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-log-ovn\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.106762 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p45mr\" (UniqueName: \"kubernetes.io/projected/f56e3a1b-d229-4eae-a010-604c46e41730-kube-api-access-p45mr\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.106788 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run-ovn\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.107014 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run-ovn\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc 
kubenswrapper[4788]: I0219 09:01:06.107022 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.107042 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-log-ovn\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.107495 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-additional-scripts\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.108736 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-scripts\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.134452 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p45mr\" (UniqueName: \"kubernetes.io/projected/f56e3a1b-d229-4eae-a010-604c46e41730-kube-api-access-p45mr\") pod \"ovn-controller-msb2b-config-4m72q\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:06 crc kubenswrapper[4788]: I0219 09:01:06.209436 4788 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.060264 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s56xm"] Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.068318 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s56xm"] Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.068433 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.128095 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-utilities\") pod \"redhat-marketplace-s56xm\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.128200 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-catalog-content\") pod \"redhat-marketplace-s56xm\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.128270 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqv2r\" (UniqueName: \"kubernetes.io/projected/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-kube-api-access-gqv2r\") pod \"redhat-marketplace-s56xm\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.230300 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-catalog-content\") pod \"redhat-marketplace-s56xm\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.230407 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqv2r\" (UniqueName: \"kubernetes.io/projected/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-kube-api-access-gqv2r\") pod \"redhat-marketplace-s56xm\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.230528 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-utilities\") pod \"redhat-marketplace-s56xm\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.230996 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-utilities\") pod \"redhat-marketplace-s56xm\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.231225 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-catalog-content\") pod \"redhat-marketplace-s56xm\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.256613 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gqv2r\" (UniqueName: \"kubernetes.io/projected/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-kube-api-access-gqv2r\") pod \"redhat-marketplace-s56xm\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:07 crc kubenswrapper[4788]: I0219 09:01:07.425228 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.841760 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-tc7kz" Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.967888 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-swiftconf\") pod \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.967951 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-ring-data-devices\") pod \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.967977 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-etc-swift\") pod \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.968102 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm46r\" (UniqueName: \"kubernetes.io/projected/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-kube-api-access-mm46r\") pod 
\"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.969172 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-dispersionconf\") pod \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.969205 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-scripts\") pod \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.969233 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-combined-ca-bundle\") pod \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\" (UID: \"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a\") " Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.968941 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a" (UID: "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.969134 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a" (UID: "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.981697 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-kube-api-access-mm46r" (OuterVolumeSpecName: "kube-api-access-mm46r") pod "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a" (UID: "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a"). InnerVolumeSpecName "kube-api-access-mm46r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.991387 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a" (UID: "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.995682 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-scripts" (OuterVolumeSpecName: "scripts") pod "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a" (UID: "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.999405 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a" (UID: "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:09 crc kubenswrapper[4788]: I0219 09:01:09.999514 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a" (UID: "7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.071671 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm46r\" (UniqueName: \"kubernetes.io/projected/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-kube-api-access-mm46r\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.072012 4788 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.072025 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.072034 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.072044 4788 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.072055 4788 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.072064 4788 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.191125 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bb0abe11-b278-4a3a-aeda-3e08a603924b","Type":"ContainerStarted","Data":"93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea"} Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.191540 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.195894 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad57631d-1772-49f0-ae6b-f16ee556e9c4","Type":"ContainerStarted","Data":"4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52"} Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.196084 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.197644 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-tc7kz" event={"ID":"7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a","Type":"ContainerDied","Data":"7f4021678ef13b6ba2637d33cad4359c32fe5be7746644ae020b9e833eb0264f"} Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.197692 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f4021678ef13b6ba2637d33cad4359c32fe5be7746644ae020b9e833eb0264f" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.197758 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-tc7kz" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.223353 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=56.996535435 podStartE2EDuration="1m5.223334392s" podCreationTimestamp="2026-02-19 09:00:05 +0000 UTC" firstStartedPulling="2026-02-19 09:00:21.305756039 +0000 UTC m=+923.293767511" lastFinishedPulling="2026-02-19 09:00:29.532554976 +0000 UTC m=+931.520566468" observedRunningTime="2026-02-19 09:01:10.217792518 +0000 UTC m=+972.205803990" watchObservedRunningTime="2026-02-19 09:01:10.223334392 +0000 UTC m=+972.211345864" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.246283 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2qxmb"] Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.252920 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=57.011991901 podStartE2EDuration="1m5.252895985s" podCreationTimestamp="2026-02-19 09:00:05 +0000 UTC" firstStartedPulling="2026-02-19 09:00:20.841898497 +0000 UTC m=+922.829909969" lastFinishedPulling="2026-02-19 09:00:29.082802571 +0000 UTC m=+931.070814053" observedRunningTime="2026-02-19 09:01:10.24552606 +0000 UTC m=+972.233537552" watchObservedRunningTime="2026-02-19 09:01:10.252895985 +0000 UTC m=+972.240907457" Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.335102 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-msb2b-config-4m72q"] Feb 19 09:01:10 crc kubenswrapper[4788]: W0219 09:01:10.339414 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf56e3a1b_d229_4eae_a010_604c46e41730.slice/crio-291f85e8769a0fc7b1a432810f6c66eb9e1297392020e2be0acfdfcafbe581c5 WatchSource:0}: Error finding container 
291f85e8769a0fc7b1a432810f6c66eb9e1297392020e2be0acfdfcafbe581c5: Status 404 returned error can't find the container with id 291f85e8769a0fc7b1a432810f6c66eb9e1297392020e2be0acfdfcafbe581c5 Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.358390 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s56xm"] Feb 19 09:01:10 crc kubenswrapper[4788]: I0219 09:01:10.593952 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-msb2b" podUID="be836fd0-7c7e-4824-b455-bb4ccec1163e" containerName="ovn-controller" probeResult="failure" output=< Feb 19 09:01:10 crc kubenswrapper[4788]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 19 09:01:10 crc kubenswrapper[4788]: > Feb 19 09:01:11 crc kubenswrapper[4788]: I0219 09:01:11.204903 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jq4z9" event={"ID":"bf1f8c5a-64ff-4e33-a3d0-409d025d567b","Type":"ContainerStarted","Data":"1afe1fa0b2fdfb5e25d59edf4360695744b01c8e3bcbc4bc0ea9e750f57884eb"} Feb 19 09:01:11 crc kubenswrapper[4788]: I0219 09:01:11.206160 4788 generic.go:334] "Generic (PLEG): container finished" podID="cd64b1ce-8564-4921-b382-8adf535a61a4" containerID="3f8e6b280343f0d0df0929ee8129368e714260c596f952098114b71dc2a4f655" exitCode=0 Feb 19 09:01:11 crc kubenswrapper[4788]: I0219 09:01:11.206210 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2qxmb" event={"ID":"cd64b1ce-8564-4921-b382-8adf535a61a4","Type":"ContainerDied","Data":"3f8e6b280343f0d0df0929ee8129368e714260c596f952098114b71dc2a4f655"} Feb 19 09:01:11 crc kubenswrapper[4788]: I0219 09:01:11.206259 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2qxmb" event={"ID":"cd64b1ce-8564-4921-b382-8adf535a61a4","Type":"ContainerStarted","Data":"7a3f7a7da89b5d0bb8d48b7292c60a64b419aa4d413ff2240956ed19a6b5dc4d"} 
Feb 19 09:01:11 crc kubenswrapper[4788]: I0219 09:01:11.207363 4788 generic.go:334] "Generic (PLEG): container finished" podID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" containerID="b4c4368e4d1ec48490a61595de5761bc7295f2177cb98e3d339bb6ba2adca362" exitCode=0 Feb 19 09:01:11 crc kubenswrapper[4788]: I0219 09:01:11.207412 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s56xm" event={"ID":"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d","Type":"ContainerDied","Data":"b4c4368e4d1ec48490a61595de5761bc7295f2177cb98e3d339bb6ba2adca362"} Feb 19 09:01:11 crc kubenswrapper[4788]: I0219 09:01:11.207431 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s56xm" event={"ID":"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d","Type":"ContainerStarted","Data":"86b6869b4513388616b1209298d32caf7b3b539c78614cd2b5db83ac36228f2c"} Feb 19 09:01:11 crc kubenswrapper[4788]: I0219 09:01:11.209697 4788 generic.go:334] "Generic (PLEG): container finished" podID="f56e3a1b-d229-4eae-a010-604c46e41730" containerID="a4cf7ab5f1bc3368868977237f9ba9aa59efe583804eaad15f3258d392e9ac5d" exitCode=0 Feb 19 09:01:11 crc kubenswrapper[4788]: I0219 09:01:11.209802 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-msb2b-config-4m72q" event={"ID":"f56e3a1b-d229-4eae-a010-604c46e41730","Type":"ContainerDied","Data":"a4cf7ab5f1bc3368868977237f9ba9aa59efe583804eaad15f3258d392e9ac5d"} Feb 19 09:01:11 crc kubenswrapper[4788]: I0219 09:01:11.209830 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-msb2b-config-4m72q" event={"ID":"f56e3a1b-d229-4eae-a010-604c46e41730","Type":"ContainerStarted","Data":"291f85e8769a0fc7b1a432810f6c66eb9e1297392020e2be0acfdfcafbe581c5"} Feb 19 09:01:11 crc kubenswrapper[4788]: I0219 09:01:11.222681 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jq4z9" podStartSLOduration=3.0223569 
podStartE2EDuration="16.222665129s" podCreationTimestamp="2026-02-19 09:00:55 +0000 UTC" firstStartedPulling="2026-02-19 09:00:56.605859322 +0000 UTC m=+958.593870794" lastFinishedPulling="2026-02-19 09:01:09.806167511 +0000 UTC m=+971.794179023" observedRunningTime="2026-02-19 09:01:11.222280158 +0000 UTC m=+973.210291630" watchObservedRunningTime="2026-02-19 09:01:11.222665129 +0000 UTC m=+973.210676601" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.218578 4788 generic.go:334] "Generic (PLEG): container finished" podID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" containerID="5cfce49ed7b9cb3d3d14608adba49cac0c254e2191f1c3e712cbac84fc0988c0" exitCode=0 Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.219900 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s56xm" event={"ID":"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d","Type":"ContainerDied","Data":"5cfce49ed7b9cb3d3d14608adba49cac0c254e2191f1c3e712cbac84fc0988c0"} Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.667847 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.674929 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2qxmb" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.816613 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-log-ovn\") pod \"f56e3a1b-d229-4eae-a010-604c46e41730\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.816669 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p45mr\" (UniqueName: \"kubernetes.io/projected/f56e3a1b-d229-4eae-a010-604c46e41730-kube-api-access-p45mr\") pod \"f56e3a1b-d229-4eae-a010-604c46e41730\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.816690 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run\") pod \"f56e3a1b-d229-4eae-a010-604c46e41730\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.816710 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f56e3a1b-d229-4eae-a010-604c46e41730" (UID: "f56e3a1b-d229-4eae-a010-604c46e41730"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.816770 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd64b1ce-8564-4921-b382-8adf535a61a4-operator-scripts\") pod \"cd64b1ce-8564-4921-b382-8adf535a61a4\" (UID: \"cd64b1ce-8564-4921-b382-8adf535a61a4\") " Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.816806 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-additional-scripts\") pod \"f56e3a1b-d229-4eae-a010-604c46e41730\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.816822 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run-ovn\") pod \"f56e3a1b-d229-4eae-a010-604c46e41730\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.816853 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-scripts\") pod \"f56e3a1b-d229-4eae-a010-604c46e41730\" (UID: \"f56e3a1b-d229-4eae-a010-604c46e41730\") " Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.816913 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glvjt\" (UniqueName: \"kubernetes.io/projected/cd64b1ce-8564-4921-b382-8adf535a61a4-kube-api-access-glvjt\") pod \"cd64b1ce-8564-4921-b382-8adf535a61a4\" (UID: \"cd64b1ce-8564-4921-b382-8adf535a61a4\") " Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.817053 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run" (OuterVolumeSpecName: "var-run") pod "f56e3a1b-d229-4eae-a010-604c46e41730" (UID: "f56e3a1b-d229-4eae-a010-604c46e41730"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.817071 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f56e3a1b-d229-4eae-a010-604c46e41730" (UID: "f56e3a1b-d229-4eae-a010-604c46e41730"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.817683 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd64b1ce-8564-4921-b382-8adf535a61a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd64b1ce-8564-4921-b382-8adf535a61a4" (UID: "cd64b1ce-8564-4921-b382-8adf535a61a4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.817968 4788 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.817986 4788 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.817996 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd64b1ce-8564-4921-b382-8adf535a61a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.818005 4788 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f56e3a1b-d229-4eae-a010-604c46e41730-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.818265 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-scripts" (OuterVolumeSpecName: "scripts") pod "f56e3a1b-d229-4eae-a010-604c46e41730" (UID: "f56e3a1b-d229-4eae-a010-604c46e41730"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.818724 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f56e3a1b-d229-4eae-a010-604c46e41730" (UID: "f56e3a1b-d229-4eae-a010-604c46e41730"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.822626 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f56e3a1b-d229-4eae-a010-604c46e41730-kube-api-access-p45mr" (OuterVolumeSpecName: "kube-api-access-p45mr") pod "f56e3a1b-d229-4eae-a010-604c46e41730" (UID: "f56e3a1b-d229-4eae-a010-604c46e41730"). InnerVolumeSpecName "kube-api-access-p45mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.823060 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd64b1ce-8564-4921-b382-8adf535a61a4-kube-api-access-glvjt" (OuterVolumeSpecName: "kube-api-access-glvjt") pod "cd64b1ce-8564-4921-b382-8adf535a61a4" (UID: "cd64b1ce-8564-4921-b382-8adf535a61a4"). InnerVolumeSpecName "kube-api-access-glvjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.919762 4788 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.919797 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f56e3a1b-d229-4eae-a010-604c46e41730-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.919807 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glvjt\" (UniqueName: \"kubernetes.io/projected/cd64b1ce-8564-4921-b382-8adf535a61a4-kube-api-access-glvjt\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:12 crc kubenswrapper[4788]: I0219 09:01:12.919819 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p45mr\" (UniqueName: 
\"kubernetes.io/projected/f56e3a1b-d229-4eae-a010-604c46e41730-kube-api-access-p45mr\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.233501 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2qxmb" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.233510 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2qxmb" event={"ID":"cd64b1ce-8564-4921-b382-8adf535a61a4","Type":"ContainerDied","Data":"7a3f7a7da89b5d0bb8d48b7292c60a64b419aa4d413ff2240956ed19a6b5dc4d"} Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.235588 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a3f7a7da89b5d0bb8d48b7292c60a64b419aa4d413ff2240956ed19a6b5dc4d" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.236727 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s56xm" event={"ID":"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d","Type":"ContainerStarted","Data":"01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5"} Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.238518 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-msb2b-config-4m72q" event={"ID":"f56e3a1b-d229-4eae-a010-604c46e41730","Type":"ContainerDied","Data":"291f85e8769a0fc7b1a432810f6c66eb9e1297392020e2be0acfdfcafbe581c5"} Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.238559 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="291f85e8769a0fc7b1a432810f6c66eb9e1297392020e2be0acfdfcafbe581c5" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.238628 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-msb2b-config-4m72q" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.295157 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s56xm" podStartSLOduration=4.844371484 podStartE2EDuration="6.295141746s" podCreationTimestamp="2026-02-19 09:01:07 +0000 UTC" firstStartedPulling="2026-02-19 09:01:11.209080061 +0000 UTC m=+973.197091533" lastFinishedPulling="2026-02-19 09:01:12.659850303 +0000 UTC m=+974.647861795" observedRunningTime="2026-02-19 09:01:13.285732564 +0000 UTC m=+975.273744036" watchObservedRunningTime="2026-02-19 09:01:13.295141746 +0000 UTC m=+975.283153218" Feb 19 09:01:13 crc kubenswrapper[4788]: E0219 09:01:13.473371 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd64b1ce_8564_4921_b382_8adf535a61a4.slice/crio-7a3f7a7da89b5d0bb8d48b7292c60a64b419aa4d413ff2240956ed19a6b5dc4d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf56e3a1b_d229_4eae_a010_604c46e41730.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd64b1ce_8564_4921_b382_8adf535a61a4.slice\": RecentStats: unable to find data in memory cache]" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.794972 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-msb2b-config-4m72q"] Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.799183 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-msb2b-config-4m72q"] Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.939085 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-msb2b-config-njg4b"] Feb 19 09:01:13 crc kubenswrapper[4788]: E0219 
09:01:13.939514 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56e3a1b-d229-4eae-a010-604c46e41730" containerName="ovn-config" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.939536 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56e3a1b-d229-4eae-a010-604c46e41730" containerName="ovn-config" Feb 19 09:01:13 crc kubenswrapper[4788]: E0219 09:01:13.939564 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a" containerName="swift-ring-rebalance" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.939574 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a" containerName="swift-ring-rebalance" Feb 19 09:01:13 crc kubenswrapper[4788]: E0219 09:01:13.939590 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd64b1ce-8564-4921-b382-8adf535a61a4" containerName="mariadb-account-create-update" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.939601 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd64b1ce-8564-4921-b382-8adf535a61a4" containerName="mariadb-account-create-update" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.939768 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd64b1ce-8564-4921-b382-8adf535a61a4" containerName="mariadb-account-create-update" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.939786 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a" containerName="swift-ring-rebalance" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.939807 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f56e3a1b-d229-4eae-a010-604c46e41730" containerName="ovn-config" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.940485 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.944855 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 19 09:01:13 crc kubenswrapper[4788]: I0219 09:01:13.952292 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-msb2b-config-njg4b"] Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.045389 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-log-ovn\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.045483 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run-ovn\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.045521 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.045562 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-additional-scripts\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: 
\"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.045579 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqwlq\" (UniqueName: \"kubernetes.io/projected/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-kube-api-access-xqwlq\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.045626 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-scripts\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.147568 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-log-ovn\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.147687 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run-ovn\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.147870 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-log-ovn\") pod 
\"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.147877 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run-ovn\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.147903 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.147926 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.147931 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-additional-scripts\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.148012 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqwlq\" (UniqueName: \"kubernetes.io/projected/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-kube-api-access-xqwlq\") pod 
\"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.148054 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-scripts\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.148786 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-additional-scripts\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.150178 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-scripts\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.167850 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqwlq\" (UniqueName: \"kubernetes.io/projected/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-kube-api-access-xqwlq\") pod \"ovn-controller-msb2b-config-njg4b\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.256402 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.723887 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f56e3a1b-d229-4eae-a010-604c46e41730" path="/var/lib/kubelet/pods/f56e3a1b-d229-4eae-a010-604c46e41730/volumes" Feb 19 09:01:14 crc kubenswrapper[4788]: W0219 09:01:14.768877 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd34f02_99ff_4d86_a5b1_4e8ec8d9e1f1.slice/crio-d6e9f25be34aded91e8d5e1615b7870c870c4563612672de0ec17aa7ab523daa WatchSource:0}: Error finding container d6e9f25be34aded91e8d5e1615b7870c870c4563612672de0ec17aa7ab523daa: Status 404 returned error can't find the container with id d6e9f25be34aded91e8d5e1615b7870c870c4563612672de0ec17aa7ab523daa Feb 19 09:01:14 crc kubenswrapper[4788]: I0219 09:01:14.795209 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-msb2b-config-njg4b"] Feb 19 09:01:15 crc kubenswrapper[4788]: I0219 09:01:15.254203 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-msb2b-config-njg4b" event={"ID":"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1","Type":"ContainerStarted","Data":"d6e9f25be34aded91e8d5e1615b7870c870c4563612672de0ec17aa7ab523daa"} Feb 19 09:01:15 crc kubenswrapper[4788]: I0219 09:01:15.279727 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift\") pod \"swift-storage-0\" (UID: \"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:01:15 crc kubenswrapper[4788]: I0219 09:01:15.287613 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f91d0357-4651-41b6-a842-27d8c7f47e60-etc-swift\") pod \"swift-storage-0\" (UID: 
\"f91d0357-4651-41b6-a842-27d8c7f47e60\") " pod="openstack/swift-storage-0" Feb 19 09:01:15 crc kubenswrapper[4788]: I0219 09:01:15.303309 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 19 09:01:15 crc kubenswrapper[4788]: I0219 09:01:15.597360 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-msb2b" Feb 19 09:01:15 crc kubenswrapper[4788]: I0219 09:01:15.920584 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 19 09:01:15 crc kubenswrapper[4788]: W0219 09:01:15.930109 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf91d0357_4651_41b6_a842_27d8c7f47e60.slice/crio-bcbb552ca81d91e18edff2087f2caa34e7e7e5cabeba590cfbbd4dc794674071 WatchSource:0}: Error finding container bcbb552ca81d91e18edff2087f2caa34e7e7e5cabeba590cfbbd4dc794674071: Status 404 returned error can't find the container with id bcbb552ca81d91e18edff2087f2caa34e7e7e5cabeba590cfbbd4dc794674071 Feb 19 09:01:16 crc kubenswrapper[4788]: I0219 09:01:16.264627 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"bcbb552ca81d91e18edff2087f2caa34e7e7e5cabeba590cfbbd4dc794674071"} Feb 19 09:01:16 crc kubenswrapper[4788]: I0219 09:01:16.266662 4788 generic.go:334] "Generic (PLEG): container finished" podID="afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1" containerID="ba56eec83a04e92d2d18ed54ea13d11542ec80e3175278394d07c2eead7cfc6d" exitCode=0 Feb 19 09:01:16 crc kubenswrapper[4788]: I0219 09:01:16.266712 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-msb2b-config-njg4b" event={"ID":"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1","Type":"ContainerDied","Data":"ba56eec83a04e92d2d18ed54ea13d11542ec80e3175278394d07c2eead7cfc6d"} Feb 19 
09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.426353 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.427458 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.506554 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.616309 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.737298 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-log-ovn\") pod \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.737434 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-additional-scripts\") pod \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.737447 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1" (UID: "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.737532 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run-ovn\") pod \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.737591 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run\") pod \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.737650 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqwlq\" (UniqueName: \"kubernetes.io/projected/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-kube-api-access-xqwlq\") pod \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.737659 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1" (UID: "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.737722 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-scripts\") pod \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\" (UID: \"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1\") " Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.737702 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run" (OuterVolumeSpecName: "var-run") pod "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1" (UID: "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.738314 4788 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.738344 4788 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-run\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.738356 4788 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.738320 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1" (UID: "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.738910 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-scripts" (OuterVolumeSpecName: "scripts") pod "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1" (UID: "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.744709 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-kube-api-access-xqwlq" (OuterVolumeSpecName: "kube-api-access-xqwlq") pod "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1" (UID: "afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1"). InnerVolumeSpecName "kube-api-access-xqwlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.840237 4788 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.840308 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqwlq\" (UniqueName: \"kubernetes.io/projected/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-kube-api-access-xqwlq\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:17 crc kubenswrapper[4788]: I0219 09:01:17.840327 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.302173 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"287e663380a982bee44f5578b9d2d6af11f74bd7f75263f8cd41b2e091eed0b3"} Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.302492 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"5a37a9cb17ace442ae528a7dcb30229bb1b99663345d68e480bed45bb0e0fa8e"} Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.302882 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"5511a874edd7879724fc2cb4907abdf087e9e3765d7c010d5d2fcfb105b131a4"} Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.302908 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"cfec204157a41423f52cdcf815ac1793d6347766a299804c8616c30f7cb418da"} Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.305493 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-msb2b-config-njg4b" Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.308333 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-msb2b-config-njg4b" event={"ID":"afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1","Type":"ContainerDied","Data":"d6e9f25be34aded91e8d5e1615b7870c870c4563612672de0ec17aa7ab523daa"} Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.308383 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e9f25be34aded91e8d5e1615b7870c870c4563612672de0ec17aa7ab523daa" Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.377825 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.456345 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s56xm"] Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.697914 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-msb2b-config-njg4b"] Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.706295 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-msb2b-config-njg4b"] Feb 19 09:01:18 crc kubenswrapper[4788]: I0219 09:01:18.724511 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1" path="/var/lib/kubelet/pods/afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1/volumes" Feb 19 09:01:19 crc kubenswrapper[4788]: I0219 09:01:19.338576 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"8f64e55a290b1b44c7b87bc5fb3e69b50d865e9559e440d56ad3b67ca2494b04"} Feb 19 09:01:20 crc kubenswrapper[4788]: I0219 09:01:20.348188 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"3c1cb92344a9f5e81f9226ac54c9419182ed9b72ccfa33e31e8bd06227f39448"} Feb 19 09:01:20 crc kubenswrapper[4788]: I0219 09:01:20.348539 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"ce9bad0670521b069ebe6ab1df6732c2e9ce3eaf1c7ecc1e7ebe6acc78468074"} Feb 19 09:01:20 crc kubenswrapper[4788]: I0219 09:01:20.348551 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"216c469463322789f0eadedaa6026648a4f4d1c08f49c92623a07d0bf0d838f8"} Feb 19 09:01:20 crc kubenswrapper[4788]: I0219 09:01:20.350117 4788 generic.go:334] "Generic (PLEG): container finished" podID="bf1f8c5a-64ff-4e33-a3d0-409d025d567b" containerID="1afe1fa0b2fdfb5e25d59edf4360695744b01c8e3bcbc4bc0ea9e750f57884eb" exitCode=0 Feb 19 09:01:20 crc kubenswrapper[4788]: I0219 09:01:20.350192 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jq4z9" event={"ID":"bf1f8c5a-64ff-4e33-a3d0-409d025d567b","Type":"ContainerDied","Data":"1afe1fa0b2fdfb5e25d59edf4360695744b01c8e3bcbc4bc0ea9e750f57884eb"} Feb 19 09:01:20 crc kubenswrapper[4788]: I0219 09:01:20.350318 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s56xm" podUID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" containerName="registry-server" containerID="cri-o://01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5" gracePeriod=2 Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.306717 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.366078 4788 generic.go:334] "Generic (PLEG): container finished" podID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" containerID="01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5" exitCode=0 Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.366146 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s56xm" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.366164 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s56xm" event={"ID":"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d","Type":"ContainerDied","Data":"01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5"} Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.366215 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s56xm" event={"ID":"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d","Type":"ContainerDied","Data":"86b6869b4513388616b1209298d32caf7b3b539c78614cd2b5db83ac36228f2c"} Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.366236 4788 scope.go:117] "RemoveContainer" containerID="01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.380274 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"f6023ed09798cb5a8fe5c2a728918b09e64ed169dd301d3a8361b5d6e6b8a018"} Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.394592 4788 scope.go:117] "RemoveContainer" containerID="5cfce49ed7b9cb3d3d14608adba49cac0c254e2191f1c3e712cbac84fc0988c0" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.401486 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-catalog-content\") pod \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.401579 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-utilities\") pod \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.401726 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqv2r\" (UniqueName: \"kubernetes.io/projected/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-kube-api-access-gqv2r\") pod \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\" (UID: \"7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d\") " Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.402637 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-utilities" (OuterVolumeSpecName: "utilities") pod "7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" (UID: "7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.405719 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-kube-api-access-gqv2r" (OuterVolumeSpecName: "kube-api-access-gqv2r") pod "7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" (UID: "7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d"). InnerVolumeSpecName "kube-api-access-gqv2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.430690 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" (UID: "7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.498288 4788 scope.go:117] "RemoveContainer" containerID="b4c4368e4d1ec48490a61595de5761bc7295f2177cb98e3d339bb6ba2adca362" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.503370 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.503396 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqv2r\" (UniqueName: \"kubernetes.io/projected/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-kube-api-access-gqv2r\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.503408 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.516317 4788 scope.go:117] "RemoveContainer" containerID="01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5" Feb 19 09:01:21 crc kubenswrapper[4788]: E0219 09:01:21.516870 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5\": container with ID starting with 
01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5 not found: ID does not exist" containerID="01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.516897 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5"} err="failed to get container status \"01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5\": rpc error: code = NotFound desc = could not find container \"01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5\": container with ID starting with 01cdcbea7809cf4061a75be63da122a3ed59ca07cc5e8901e08ac8d25c55d6e5 not found: ID does not exist" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.516920 4788 scope.go:117] "RemoveContainer" containerID="5cfce49ed7b9cb3d3d14608adba49cac0c254e2191f1c3e712cbac84fc0988c0" Feb 19 09:01:21 crc kubenswrapper[4788]: E0219 09:01:21.517389 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfce49ed7b9cb3d3d14608adba49cac0c254e2191f1c3e712cbac84fc0988c0\": container with ID starting with 5cfce49ed7b9cb3d3d14608adba49cac0c254e2191f1c3e712cbac84fc0988c0 not found: ID does not exist" containerID="5cfce49ed7b9cb3d3d14608adba49cac0c254e2191f1c3e712cbac84fc0988c0" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.517412 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfce49ed7b9cb3d3d14608adba49cac0c254e2191f1c3e712cbac84fc0988c0"} err="failed to get container status \"5cfce49ed7b9cb3d3d14608adba49cac0c254e2191f1c3e712cbac84fc0988c0\": rpc error: code = NotFound desc = could not find container \"5cfce49ed7b9cb3d3d14608adba49cac0c254e2191f1c3e712cbac84fc0988c0\": container with ID starting with 5cfce49ed7b9cb3d3d14608adba49cac0c254e2191f1c3e712cbac84fc0988c0 not found: ID does not 
exist" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.517426 4788 scope.go:117] "RemoveContainer" containerID="b4c4368e4d1ec48490a61595de5761bc7295f2177cb98e3d339bb6ba2adca362" Feb 19 09:01:21 crc kubenswrapper[4788]: E0219 09:01:21.517776 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4c4368e4d1ec48490a61595de5761bc7295f2177cb98e3d339bb6ba2adca362\": container with ID starting with b4c4368e4d1ec48490a61595de5761bc7295f2177cb98e3d339bb6ba2adca362 not found: ID does not exist" containerID="b4c4368e4d1ec48490a61595de5761bc7295f2177cb98e3d339bb6ba2adca362" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.517809 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4c4368e4d1ec48490a61595de5761bc7295f2177cb98e3d339bb6ba2adca362"} err="failed to get container status \"b4c4368e4d1ec48490a61595de5761bc7295f2177cb98e3d339bb6ba2adca362\": rpc error: code = NotFound desc = could not find container \"b4c4368e4d1ec48490a61595de5761bc7295f2177cb98e3d339bb6ba2adca362\": container with ID starting with b4c4368e4d1ec48490a61595de5761bc7295f2177cb98e3d339bb6ba2adca362 not found: ID does not exist" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.712178 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s56xm"] Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.721203 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s56xm"] Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.733860 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jq4z9" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.812905 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-config-data\") pod \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.812968 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb7mr\" (UniqueName: \"kubernetes.io/projected/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-kube-api-access-jb7mr\") pod \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.813013 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-db-sync-config-data\") pod \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.813039 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-combined-ca-bundle\") pod \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\" (UID: \"bf1f8c5a-64ff-4e33-a3d0-409d025d567b\") " Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.816568 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bf1f8c5a-64ff-4e33-a3d0-409d025d567b" (UID: "bf1f8c5a-64ff-4e33-a3d0-409d025d567b"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.817349 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-kube-api-access-jb7mr" (OuterVolumeSpecName: "kube-api-access-jb7mr") pod "bf1f8c5a-64ff-4e33-a3d0-409d025d567b" (UID: "bf1f8c5a-64ff-4e33-a3d0-409d025d567b"). InnerVolumeSpecName "kube-api-access-jb7mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.832414 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf1f8c5a-64ff-4e33-a3d0-409d025d567b" (UID: "bf1f8c5a-64ff-4e33-a3d0-409d025d567b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.852243 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-config-data" (OuterVolumeSpecName: "config-data") pod "bf1f8c5a-64ff-4e33-a3d0-409d025d567b" (UID: "bf1f8c5a-64ff-4e33-a3d0-409d025d567b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.915714 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.915769 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb7mr\" (UniqueName: \"kubernetes.io/projected/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-kube-api-access-jb7mr\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.915785 4788 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:21 crc kubenswrapper[4788]: I0219 09:01:21.915799 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf1f8c5a-64ff-4e33-a3d0-409d025d567b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.397665 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"96309031feb756c801e940f9d15ff122768de8ca048e0c7964a7564a771f64a6"} Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.398025 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"b383aff4c6df397ba18f77991376fc1158245bc8726bf2b94e6cb6fff158692b"} Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.398043 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"b6d88d8e8ede7ae3e88307c4ef16df9b2d4485531151c034f7dd292c6b4768f8"} Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.398056 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"2fdf27a080cf21b9995d640982c61bd181a0fbef9bc1a48512be05f0ac56464c"} Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.398069 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"f952901142bc89bbba4df649207960a7c4a2ca10b76c3fcb795a26380dc0b0a7"} Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.398079 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"f91d0357-4651-41b6-a842-27d8c7f47e60","Type":"ContainerStarted","Data":"fb861fc604136ce3ad2486c357414a3f4ad2e16408572e2f1276518cef69ffa7"} Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.401570 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jq4z9" event={"ID":"bf1f8c5a-64ff-4e33-a3d0-409d025d567b","Type":"ContainerDied","Data":"7b5c3e7e23c6cd4818ea318ddb44f2cc0481a95d3bc0bf5abb69aff628b2c1da"} Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.401621 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b5c3e7e23c6cd4818ea318ddb44f2cc0481a95d3bc0bf5abb69aff628b2c1da" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.401677 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jq4z9" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.437738 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.45691507 podStartE2EDuration="40.43771762s" podCreationTimestamp="2026-02-19 09:00:42 +0000 UTC" firstStartedPulling="2026-02-19 09:01:15.931998284 +0000 UTC m=+977.920009766" lastFinishedPulling="2026-02-19 09:01:20.912800844 +0000 UTC m=+982.900812316" observedRunningTime="2026-02-19 09:01:22.437439823 +0000 UTC m=+984.425451305" watchObservedRunningTime="2026-02-19 09:01:22.43771762 +0000 UTC m=+984.425729092" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.748683 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" path="/var/lib/kubelet/pods/7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d/volumes" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.770625 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-46jn5"] Feb 19 09:01:22 crc kubenswrapper[4788]: E0219 09:01:22.771331 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" containerName="extract-utilities" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.771344 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" containerName="extract-utilities" Feb 19 09:01:22 crc kubenswrapper[4788]: E0219 09:01:22.771372 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" containerName="registry-server" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.771378 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" containerName="registry-server" Feb 19 09:01:22 crc kubenswrapper[4788]: E0219 09:01:22.771398 4788 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" containerName="extract-content" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.771404 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" containerName="extract-content" Feb 19 09:01:22 crc kubenswrapper[4788]: E0219 09:01:22.771413 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1" containerName="ovn-config" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.771418 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1" containerName="ovn-config" Feb 19 09:01:22 crc kubenswrapper[4788]: E0219 09:01:22.771448 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1f8c5a-64ff-4e33-a3d0-409d025d567b" containerName="glance-db-sync" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.771454 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1f8c5a-64ff-4e33-a3d0-409d025d567b" containerName="glance-db-sync" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.771796 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f29c976-4f5e-4a04-8dbe-ad3cf8da8a1d" containerName="registry-server" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.772033 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1f8c5a-64ff-4e33-a3d0-409d025d567b" containerName="glance-db-sync" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.772091 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd34f02-99ff-4d86-a5b1-4e8ec8d9e1f1" containerName="ovn-config" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.773828 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.811633 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-46jn5"] Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.840899 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-config\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.840948 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.840988 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.841034 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf9rr\" (UniqueName: \"kubernetes.io/projected/ded01393-d464-432f-9669-ab333513cd76-kube-api-access-bf9rr\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.841073 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.875682 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-46jn5"] Feb 19 09:01:22 crc kubenswrapper[4788]: E0219 09:01:22.876186 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-bf9rr ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" podUID="ded01393-d464-432f-9669-ab333513cd76" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.896331 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-jdnlc"] Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.897512 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.900831 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.908109 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-jdnlc"] Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.942679 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.942729 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6klg\" (UniqueName: \"kubernetes.io/projected/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-kube-api-access-q6klg\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.942758 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.942778 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 
19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.942825 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf9rr\" (UniqueName: \"kubernetes.io/projected/ded01393-d464-432f-9669-ab333513cd76-kube-api-access-bf9rr\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.942849 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.942878 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.942891 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.942911 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.942943 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-config\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.942968 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-config\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.943775 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-config\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.943934 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.944132 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.945005 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:22 crc kubenswrapper[4788]: I0219 09:01:22.962800 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf9rr\" (UniqueName: \"kubernetes.io/projected/ded01393-d464-432f-9669-ab333513cd76-kube-api-access-bf9rr\") pod \"dnsmasq-dns-5b946c75cc-46jn5\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.044201 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6klg\" (UniqueName: \"kubernetes.io/projected/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-kube-api-access-q6klg\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.044312 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.044486 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.044516 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.044570 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.044652 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-config\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.045511 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.045546 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.045523 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.045676 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.045691 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-config\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.068454 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6klg\" (UniqueName: \"kubernetes.io/projected/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-kube-api-access-q6klg\") pod \"dnsmasq-dns-74f6bcbc87-jdnlc\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") " pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.228890 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.408507 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.423400 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.554786 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-nb\") pod \"ded01393-d464-432f-9669-ab333513cd76\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.554855 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-sb\") pod \"ded01393-d464-432f-9669-ab333513cd76\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.554913 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-config\") pod \"ded01393-d464-432f-9669-ab333513cd76\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.554967 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-dns-svc\") pod \"ded01393-d464-432f-9669-ab333513cd76\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.555025 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf9rr\" (UniqueName: \"kubernetes.io/projected/ded01393-d464-432f-9669-ab333513cd76-kube-api-access-bf9rr\") pod \"ded01393-d464-432f-9669-ab333513cd76\" (UID: \"ded01393-d464-432f-9669-ab333513cd76\") " Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.555561 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ded01393-d464-432f-9669-ab333513cd76" (UID: "ded01393-d464-432f-9669-ab333513cd76"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.555777 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-config" (OuterVolumeSpecName: "config") pod "ded01393-d464-432f-9669-ab333513cd76" (UID: "ded01393-d464-432f-9669-ab333513cd76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.555920 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ded01393-d464-432f-9669-ab333513cd76" (UID: "ded01393-d464-432f-9669-ab333513cd76"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.556030 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ded01393-d464-432f-9669-ab333513cd76" (UID: "ded01393-d464-432f-9669-ab333513cd76"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.556429 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.556453 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.556462 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.556472 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ded01393-d464-432f-9669-ab333513cd76-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.561302 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded01393-d464-432f-9669-ab333513cd76-kube-api-access-bf9rr" (OuterVolumeSpecName: "kube-api-access-bf9rr") pod "ded01393-d464-432f-9669-ab333513cd76" (UID: "ded01393-d464-432f-9669-ab333513cd76"). InnerVolumeSpecName "kube-api-access-bf9rr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:23 crc kubenswrapper[4788]: W0219 09:01:23.640349 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe6d6323_f486_46f4_86c4_2e69ad36b0ad.slice/crio-9d2177cef06c0cbadefa8ebba90ff822e6d788a5414dd46335c73b3abe8605fe WatchSource:0}: Error finding container 9d2177cef06c0cbadefa8ebba90ff822e6d788a5414dd46335c73b3abe8605fe: Status 404 returned error can't find the container with id 9d2177cef06c0cbadefa8ebba90ff822e6d788a5414dd46335c73b3abe8605fe Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.647992 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-jdnlc"] Feb 19 09:01:23 crc kubenswrapper[4788]: I0219 09:01:23.657568 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf9rr\" (UniqueName: \"kubernetes.io/projected/ded01393-d464-432f-9669-ab333513cd76-kube-api-access-bf9rr\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:24 crc kubenswrapper[4788]: I0219 09:01:24.416893 4788 generic.go:334] "Generic (PLEG): container finished" podID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerID="a1eb30227ff2aaa4f06e26503102b3e993fce23a450bb6311df3abbea081c9d9" exitCode=0 Feb 19 09:01:24 crc kubenswrapper[4788]: I0219 09:01:24.416976 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" event={"ID":"fe6d6323-f486-46f4-86c4-2e69ad36b0ad","Type":"ContainerDied","Data":"a1eb30227ff2aaa4f06e26503102b3e993fce23a450bb6311df3abbea081c9d9"} Feb 19 09:01:24 crc kubenswrapper[4788]: I0219 09:01:24.418135 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" event={"ID":"fe6d6323-f486-46f4-86c4-2e69ad36b0ad","Type":"ContainerStarted","Data":"9d2177cef06c0cbadefa8ebba90ff822e6d788a5414dd46335c73b3abe8605fe"} Feb 19 09:01:24 crc kubenswrapper[4788]: I0219 09:01:24.418175 4788 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-46jn5" Feb 19 09:01:24 crc kubenswrapper[4788]: I0219 09:01:24.662838 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-46jn5"] Feb 19 09:01:24 crc kubenswrapper[4788]: I0219 09:01:24.671230 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-46jn5"] Feb 19 09:01:24 crc kubenswrapper[4788]: I0219 09:01:24.725244 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded01393-d464-432f-9669-ab333513cd76" path="/var/lib/kubelet/pods/ded01393-d464-432f-9669-ab333513cd76/volumes" Feb 19 09:01:25 crc kubenswrapper[4788]: I0219 09:01:25.427318 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" event={"ID":"fe6d6323-f486-46f4-86c4-2e69ad36b0ad","Type":"ContainerStarted","Data":"18867f1c93616ceb740fa72d23c4d9b44f08f9f4c708b15efaf4a883fef242a7"} Feb 19 09:01:25 crc kubenswrapper[4788]: I0219 09:01:25.428353 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:25 crc kubenswrapper[4788]: I0219 09:01:25.446975 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" podStartSLOduration=3.446959832 podStartE2EDuration="3.446959832s" podCreationTimestamp="2026-02-19 09:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:01:25.445140232 +0000 UTC m=+987.433151704" watchObservedRunningTime="2026-02-19 09:01:25.446959832 +0000 UTC m=+987.434971304" Feb 19 09:01:26 crc kubenswrapper[4788]: I0219 09:01:26.597408 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 09:01:26 crc kubenswrapper[4788]: I0219 09:01:26.852459 4788 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.021973 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-jdh5n"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.022900 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jdh5n" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.052186 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jdh5n"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.099672 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-5w8kn"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.100626 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-5w8kn" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.117051 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gw2n\" (UniqueName: \"kubernetes.io/projected/da103dab-8e46-466c-90db-c237910cc9e7-kube-api-access-2gw2n\") pod \"heat-db-create-5w8kn\" (UID: \"da103dab-8e46-466c-90db-c237910cc9e7\") " pod="openstack/heat-db-create-5w8kn" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.117165 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231f701c-d9c4-4157-bcf2-fe8875ce36e7-operator-scripts\") pod \"cinder-db-create-jdh5n\" (UID: \"231f701c-d9c4-4157-bcf2-fe8875ce36e7\") " pod="openstack/cinder-db-create-jdh5n" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.117236 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbtjm\" (UniqueName: 
\"kubernetes.io/projected/231f701c-d9c4-4157-bcf2-fe8875ce36e7-kube-api-access-rbtjm\") pod \"cinder-db-create-jdh5n\" (UID: \"231f701c-d9c4-4157-bcf2-fe8875ce36e7\") " pod="openstack/cinder-db-create-jdh5n" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.117287 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da103dab-8e46-466c-90db-c237910cc9e7-operator-scripts\") pod \"heat-db-create-5w8kn\" (UID: \"da103dab-8e46-466c-90db-c237910cc9e7\") " pod="openstack/heat-db-create-5w8kn" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.140418 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-5w8kn"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.220058 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gw2n\" (UniqueName: \"kubernetes.io/projected/da103dab-8e46-466c-90db-c237910cc9e7-kube-api-access-2gw2n\") pod \"heat-db-create-5w8kn\" (UID: \"da103dab-8e46-466c-90db-c237910cc9e7\") " pod="openstack/heat-db-create-5w8kn" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.220150 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231f701c-d9c4-4157-bcf2-fe8875ce36e7-operator-scripts\") pod \"cinder-db-create-jdh5n\" (UID: \"231f701c-d9c4-4157-bcf2-fe8875ce36e7\") " pod="openstack/cinder-db-create-jdh5n" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.220219 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbtjm\" (UniqueName: \"kubernetes.io/projected/231f701c-d9c4-4157-bcf2-fe8875ce36e7-kube-api-access-rbtjm\") pod \"cinder-db-create-jdh5n\" (UID: \"231f701c-d9c4-4157-bcf2-fe8875ce36e7\") " pod="openstack/cinder-db-create-jdh5n" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.220236 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da103dab-8e46-466c-90db-c237910cc9e7-operator-scripts\") pod \"heat-db-create-5w8kn\" (UID: \"da103dab-8e46-466c-90db-c237910cc9e7\") " pod="openstack/heat-db-create-5w8kn" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.220992 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da103dab-8e46-466c-90db-c237910cc9e7-operator-scripts\") pod \"heat-db-create-5w8kn\" (UID: \"da103dab-8e46-466c-90db-c237910cc9e7\") " pod="openstack/heat-db-create-5w8kn" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.221625 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231f701c-d9c4-4157-bcf2-fe8875ce36e7-operator-scripts\") pod \"cinder-db-create-jdh5n\" (UID: \"231f701c-d9c4-4157-bcf2-fe8875ce36e7\") " pod="openstack/cinder-db-create-jdh5n" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.259169 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbtjm\" (UniqueName: \"kubernetes.io/projected/231f701c-d9c4-4157-bcf2-fe8875ce36e7-kube-api-access-rbtjm\") pod \"cinder-db-create-jdh5n\" (UID: \"231f701c-d9c4-4157-bcf2-fe8875ce36e7\") " pod="openstack/cinder-db-create-jdh5n" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.263309 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-fbfe-account-create-update-jmjqh"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.264280 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fbfe-account-create-update-jmjqh" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.266350 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gw2n\" (UniqueName: \"kubernetes.io/projected/da103dab-8e46-466c-90db-c237910cc9e7-kube-api-access-2gw2n\") pod \"heat-db-create-5w8kn\" (UID: \"da103dab-8e46-466c-90db-c237910cc9e7\") " pod="openstack/heat-db-create-5w8kn" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.266712 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.280363 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fbfe-account-create-update-jmjqh"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.322005 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldq45\" (UniqueName: \"kubernetes.io/projected/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-kube-api-access-ldq45\") pod \"cinder-fbfe-account-create-update-jmjqh\" (UID: \"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d\") " pod="openstack/cinder-fbfe-account-create-update-jmjqh" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.322098 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-operator-scripts\") pod \"cinder-fbfe-account-create-update-jmjqh\" (UID: \"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d\") " pod="openstack/cinder-fbfe-account-create-update-jmjqh" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.340113 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-07bb-account-create-update-kvcc2"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.341305 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-07bb-account-create-update-kvcc2" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.343175 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.353373 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jdh5n" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.390832 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-07bb-account-create-update-kvcc2"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.424964 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-operator-scripts\") pod \"cinder-fbfe-account-create-update-jmjqh\" (UID: \"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d\") " pod="openstack/cinder-fbfe-account-create-update-jmjqh" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.425368 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1223d0f-9cda-4590-9ae6-353c58886f99-operator-scripts\") pod \"heat-07bb-account-create-update-kvcc2\" (UID: \"d1223d0f-9cda-4590-9ae6-353c58886f99\") " pod="openstack/heat-07bb-account-create-update-kvcc2" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.425476 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldq45\" (UniqueName: \"kubernetes.io/projected/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-kube-api-access-ldq45\") pod \"cinder-fbfe-account-create-update-jmjqh\" (UID: \"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d\") " pod="openstack/cinder-fbfe-account-create-update-jmjqh" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.425501 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksk9m\" (UniqueName: \"kubernetes.io/projected/d1223d0f-9cda-4590-9ae6-353c58886f99-kube-api-access-ksk9m\") pod \"heat-07bb-account-create-update-kvcc2\" (UID: \"d1223d0f-9cda-4590-9ae6-353c58886f99\") " pod="openstack/heat-07bb-account-create-update-kvcc2" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.426263 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-operator-scripts\") pod \"cinder-fbfe-account-create-update-jmjqh\" (UID: \"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d\") " pod="openstack/cinder-fbfe-account-create-update-jmjqh" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.440399 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-5w8kn" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.448273 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldq45\" (UniqueName: \"kubernetes.io/projected/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-kube-api-access-ldq45\") pod \"cinder-fbfe-account-create-update-jmjqh\" (UID: \"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d\") " pod="openstack/cinder-fbfe-account-create-update-jmjqh" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.535594 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1223d0f-9cda-4590-9ae6-353c58886f99-operator-scripts\") pod \"heat-07bb-account-create-update-kvcc2\" (UID: \"d1223d0f-9cda-4590-9ae6-353c58886f99\") " pod="openstack/heat-07bb-account-create-update-kvcc2" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.536946 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d1223d0f-9cda-4590-9ae6-353c58886f99-operator-scripts\") pod \"heat-07bb-account-create-update-kvcc2\" (UID: \"d1223d0f-9cda-4590-9ae6-353c58886f99\") " pod="openstack/heat-07bb-account-create-update-kvcc2" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.536994 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksk9m\" (UniqueName: \"kubernetes.io/projected/d1223d0f-9cda-4590-9ae6-353c58886f99-kube-api-access-ksk9m\") pod \"heat-07bb-account-create-update-kvcc2\" (UID: \"d1223d0f-9cda-4590-9ae6-353c58886f99\") " pod="openstack/heat-07bb-account-create-update-kvcc2" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.539238 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-d8ngr"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.540192 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d8ngr" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.558706 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0317-account-create-update-p9xsz"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.559753 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0317-account-create-update-p9xsz" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.562289 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.564621 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d8ngr"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.573775 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksk9m\" (UniqueName: \"kubernetes.io/projected/d1223d0f-9cda-4590-9ae6-353c58886f99-kube-api-access-ksk9m\") pod \"heat-07bb-account-create-update-kvcc2\" (UID: \"d1223d0f-9cda-4590-9ae6-353c58886f99\") " pod="openstack/heat-07bb-account-create-update-kvcc2" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.580122 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-zzzf4"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.581317 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.587945 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.588139 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rzps4" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.588404 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.588600 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.621022 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0317-account-create-update-p9xsz"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.637870 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-combined-ca-bundle\") pod \"keystone-db-sync-zzzf4\" (UID: \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.637926 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgdr\" (UniqueName: \"kubernetes.io/projected/59cdc318-8f14-4606-aa81-1a16a1ed697b-kube-api-access-psgdr\") pod \"neutron-0317-account-create-update-p9xsz\" (UID: \"59cdc318-8f14-4606-aa81-1a16a1ed697b\") " pod="openstack/neutron-0317-account-create-update-p9xsz" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.637960 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-config-data\") pod \"keystone-db-sync-zzzf4\" (UID: \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.638012 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6xtt\" (UniqueName: \"kubernetes.io/projected/7b9a6da4-e188-4741-b5a4-60a33b8cd415-kube-api-access-b6xtt\") pod \"neutron-db-create-d8ngr\" (UID: \"7b9a6da4-e188-4741-b5a4-60a33b8cd415\") " pod="openstack/neutron-db-create-d8ngr" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.638093 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59cdc318-8f14-4606-aa81-1a16a1ed697b-operator-scripts\") pod \"neutron-0317-account-create-update-p9xsz\" (UID: \"59cdc318-8f14-4606-aa81-1a16a1ed697b\") " pod="openstack/neutron-0317-account-create-update-p9xsz" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.638131 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b9a6da4-e188-4741-b5a4-60a33b8cd415-operator-scripts\") pod \"neutron-db-create-d8ngr\" (UID: \"7b9a6da4-e188-4741-b5a4-60a33b8cd415\") " pod="openstack/neutron-db-create-d8ngr" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.638147 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh59s\" (UniqueName: \"kubernetes.io/projected/ccb2eae7-2f3d-424a-b805-b8452ceee91f-kube-api-access-qh59s\") pod \"keystone-db-sync-zzzf4\" (UID: \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.641321 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-db-sync-zzzf4"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.651322 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fbfe-account-create-update-jmjqh" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.666594 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-07bb-account-create-update-kvcc2" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.712488 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-brnnm"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.713504 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-brnnm" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.721754 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b159-account-create-update-lk6tl"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.723329 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b159-account-create-update-lk6tl" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.725337 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.734375 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-brnnm"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.740947 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv5cx\" (UniqueName: \"kubernetes.io/projected/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-kube-api-access-xv5cx\") pod \"barbican-db-create-brnnm\" (UID: \"b9eb8664-672b-45c0-a128-1e60f6ea6a0e\") " pod="openstack/barbican-db-create-brnnm" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.741038 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-combined-ca-bundle\") pod \"keystone-db-sync-zzzf4\" (UID: \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.741779 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgdr\" (UniqueName: \"kubernetes.io/projected/59cdc318-8f14-4606-aa81-1a16a1ed697b-kube-api-access-psgdr\") pod \"neutron-0317-account-create-update-p9xsz\" (UID: \"59cdc318-8f14-4606-aa81-1a16a1ed697b\") " pod="openstack/neutron-0317-account-create-update-p9xsz" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.741870 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-config-data\") pod \"keystone-db-sync-zzzf4\" (UID: \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " pod="openstack/keystone-db-sync-zzzf4" 
Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.741902 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6xtt\" (UniqueName: \"kubernetes.io/projected/7b9a6da4-e188-4741-b5a4-60a33b8cd415-kube-api-access-b6xtt\") pod \"neutron-db-create-d8ngr\" (UID: \"7b9a6da4-e188-4741-b5a4-60a33b8cd415\") " pod="openstack/neutron-db-create-d8ngr" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.741994 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87gxf\" (UniqueName: \"kubernetes.io/projected/d07fa4d7-5916-4288-9b30-0413795f6a69-kube-api-access-87gxf\") pod \"barbican-b159-account-create-update-lk6tl\" (UID: \"d07fa4d7-5916-4288-9b30-0413795f6a69\") " pod="openstack/barbican-b159-account-create-update-lk6tl" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.742019 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-operator-scripts\") pod \"barbican-db-create-brnnm\" (UID: \"b9eb8664-672b-45c0-a128-1e60f6ea6a0e\") " pod="openstack/barbican-db-create-brnnm" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.742062 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59cdc318-8f14-4606-aa81-1a16a1ed697b-operator-scripts\") pod \"neutron-0317-account-create-update-p9xsz\" (UID: \"59cdc318-8f14-4606-aa81-1a16a1ed697b\") " pod="openstack/neutron-0317-account-create-update-p9xsz" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.742626 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b9a6da4-e188-4741-b5a4-60a33b8cd415-operator-scripts\") pod \"neutron-db-create-d8ngr\" (UID: 
\"7b9a6da4-e188-4741-b5a4-60a33b8cd415\") " pod="openstack/neutron-db-create-d8ngr" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.742696 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59cdc318-8f14-4606-aa81-1a16a1ed697b-operator-scripts\") pod \"neutron-0317-account-create-update-p9xsz\" (UID: \"59cdc318-8f14-4606-aa81-1a16a1ed697b\") " pod="openstack/neutron-0317-account-create-update-p9xsz" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.742970 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh59s\" (UniqueName: \"kubernetes.io/projected/ccb2eae7-2f3d-424a-b805-b8452ceee91f-kube-api-access-qh59s\") pod \"keystone-db-sync-zzzf4\" (UID: \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.743383 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d07fa4d7-5916-4288-9b30-0413795f6a69-operator-scripts\") pod \"barbican-b159-account-create-update-lk6tl\" (UID: \"d07fa4d7-5916-4288-9b30-0413795f6a69\") " pod="openstack/barbican-b159-account-create-update-lk6tl" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.743511 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b9a6da4-e188-4741-b5a4-60a33b8cd415-operator-scripts\") pod \"neutron-db-create-d8ngr\" (UID: \"7b9a6da4-e188-4741-b5a4-60a33b8cd415\") " pod="openstack/neutron-db-create-d8ngr" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.750857 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-config-data\") pod \"keystone-db-sync-zzzf4\" (UID: 
\"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.752863 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b159-account-create-update-lk6tl"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.752881 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-combined-ca-bundle\") pod \"keystone-db-sync-zzzf4\" (UID: \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.761404 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6xtt\" (UniqueName: \"kubernetes.io/projected/7b9a6da4-e188-4741-b5a4-60a33b8cd415-kube-api-access-b6xtt\") pod \"neutron-db-create-d8ngr\" (UID: \"7b9a6da4-e188-4741-b5a4-60a33b8cd415\") " pod="openstack/neutron-db-create-d8ngr" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.761504 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh59s\" (UniqueName: \"kubernetes.io/projected/ccb2eae7-2f3d-424a-b805-b8452ceee91f-kube-api-access-qh59s\") pod \"keystone-db-sync-zzzf4\" (UID: \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.763657 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgdr\" (UniqueName: \"kubernetes.io/projected/59cdc318-8f14-4606-aa81-1a16a1ed697b-kube-api-access-psgdr\") pod \"neutron-0317-account-create-update-p9xsz\" (UID: \"59cdc318-8f14-4606-aa81-1a16a1ed697b\") " pod="openstack/neutron-0317-account-create-update-p9xsz" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.844991 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87gxf\" 
(UniqueName: \"kubernetes.io/projected/d07fa4d7-5916-4288-9b30-0413795f6a69-kube-api-access-87gxf\") pod \"barbican-b159-account-create-update-lk6tl\" (UID: \"d07fa4d7-5916-4288-9b30-0413795f6a69\") " pod="openstack/barbican-b159-account-create-update-lk6tl" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.845385 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-operator-scripts\") pod \"barbican-db-create-brnnm\" (UID: \"b9eb8664-672b-45c0-a128-1e60f6ea6a0e\") " pod="openstack/barbican-db-create-brnnm" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.845446 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d07fa4d7-5916-4288-9b30-0413795f6a69-operator-scripts\") pod \"barbican-b159-account-create-update-lk6tl\" (UID: \"d07fa4d7-5916-4288-9b30-0413795f6a69\") " pod="openstack/barbican-b159-account-create-update-lk6tl" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.845482 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv5cx\" (UniqueName: \"kubernetes.io/projected/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-kube-api-access-xv5cx\") pod \"barbican-db-create-brnnm\" (UID: \"b9eb8664-672b-45c0-a128-1e60f6ea6a0e\") " pod="openstack/barbican-db-create-brnnm" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.846171 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-operator-scripts\") pod \"barbican-db-create-brnnm\" (UID: \"b9eb8664-672b-45c0-a128-1e60f6ea6a0e\") " pod="openstack/barbican-db-create-brnnm" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.846175 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d07fa4d7-5916-4288-9b30-0413795f6a69-operator-scripts\") pod \"barbican-b159-account-create-update-lk6tl\" (UID: \"d07fa4d7-5916-4288-9b30-0413795f6a69\") " pod="openstack/barbican-b159-account-create-update-lk6tl" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.862870 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv5cx\" (UniqueName: \"kubernetes.io/projected/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-kube-api-access-xv5cx\") pod \"barbican-db-create-brnnm\" (UID: \"b9eb8664-672b-45c0-a128-1e60f6ea6a0e\") " pod="openstack/barbican-db-create-brnnm" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.863315 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87gxf\" (UniqueName: \"kubernetes.io/projected/d07fa4d7-5916-4288-9b30-0413795f6a69-kube-api-access-87gxf\") pod \"barbican-b159-account-create-update-lk6tl\" (UID: \"d07fa4d7-5916-4288-9b30-0413795f6a69\") " pod="openstack/barbican-b159-account-create-update-lk6tl" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.891725 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d8ngr" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.901647 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0317-account-create-update-p9xsz" Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.907554 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jdh5n"] Feb 19 09:01:27 crc kubenswrapper[4788]: I0219 09:01:27.923016 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.018824 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-5w8kn"] Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.037388 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-brnnm" Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.051125 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b159-account-create-update-lk6tl" Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.204875 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fbfe-account-create-update-jmjqh"] Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.332764 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-07bb-account-create-update-kvcc2"] Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.453277 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0317-account-create-update-p9xsz"] Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.503922 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fbfe-account-create-update-jmjqh" event={"ID":"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d","Type":"ContainerStarted","Data":"6037864e63a6ddf0e056b72e75b44a68dc6caeef20e718ab2db587eb9f6942f0"} Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.526868 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-5w8kn" event={"ID":"da103dab-8e46-466c-90db-c237910cc9e7","Type":"ContainerStarted","Data":"03c8e46258faa59402c50956ae0fdb7548577672ca54b8637f9ef9880d95928f"} Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.526907 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-5w8kn" 
event={"ID":"da103dab-8e46-466c-90db-c237910cc9e7","Type":"ContainerStarted","Data":"f94bfed4ac6c4d086a519a2ee06d6faf2432b4cb80bd659f3f4aaa3cb8c49093"} Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.534848 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-07bb-account-create-update-kvcc2" event={"ID":"d1223d0f-9cda-4590-9ae6-353c58886f99","Type":"ContainerStarted","Data":"a0b0bc7edc28e9baf93eb3e779ccab7e468fddfce2f1e8eb7dd11a88269efe3c"} Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.567313 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jdh5n" event={"ID":"231f701c-d9c4-4157-bcf2-fe8875ce36e7","Type":"ContainerStarted","Data":"b684a72825a8b22fe4690f2297d04452cb5f4f0fdd16d107ac50e805be889f68"} Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.567350 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jdh5n" event={"ID":"231f701c-d9c4-4157-bcf2-fe8875ce36e7","Type":"ContainerStarted","Data":"cd04a1c229c291e549b9fceb46cb2033f0e4b0f094154d28ee1aa1e3d7aa54f9"} Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.573664 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-5w8kn" podStartSLOduration=1.5736491639999999 podStartE2EDuration="1.573649164s" podCreationTimestamp="2026-02-19 09:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:01:28.550372116 +0000 UTC m=+990.538383588" watchObservedRunningTime="2026-02-19 09:01:28.573649164 +0000 UTC m=+990.561660636" Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.577144 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zzzf4"] Feb 19 09:01:28 crc kubenswrapper[4788]: W0219 09:01:28.605191 4788 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b9a6da4_e188_4741_b5a4_60a33b8cd415.slice/crio-17de8b037b24be65da2f5a9db7d2b7ad1109ffee067673a39bfcd32c87fac056 WatchSource:0}: Error finding container 17de8b037b24be65da2f5a9db7d2b7ad1109ffee067673a39bfcd32c87fac056: Status 404 returned error can't find the container with id 17de8b037b24be65da2f5a9db7d2b7ad1109ffee067673a39bfcd32c87fac056 Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.630708 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d8ngr"] Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.696232 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b159-account-create-update-lk6tl"] Feb 19 09:01:28 crc kubenswrapper[4788]: W0219 09:01:28.706980 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd07fa4d7_5916_4288_9b30_0413795f6a69.slice/crio-5d2feb6e3e4e976efa5affa9726971a6f9c1a255a4748417c78a46945f43fe5f WatchSource:0}: Error finding container 5d2feb6e3e4e976efa5affa9726971a6f9c1a255a4748417c78a46945f43fe5f: Status 404 returned error can't find the container with id 5d2feb6e3e4e976efa5affa9726971a6f9c1a255a4748417c78a46945f43fe5f Feb 19 09:01:28 crc kubenswrapper[4788]: W0219 09:01:28.786088 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9eb8664_672b_45c0_a128_1e60f6ea6a0e.slice/crio-a03cdbac5a94bed0560a5dd8d9311d101fa75f1b5462ddf1809a7e9d1a829908 WatchSource:0}: Error finding container a03cdbac5a94bed0560a5dd8d9311d101fa75f1b5462ddf1809a7e9d1a829908: Status 404 returned error can't find the container with id a03cdbac5a94bed0560a5dd8d9311d101fa75f1b5462ddf1809a7e9d1a829908 Feb 19 09:01:28 crc kubenswrapper[4788]: I0219 09:01:28.807214 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-brnnm"] Feb 19 
09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.588675 4788 generic.go:334] "Generic (PLEG): container finished" podID="59cdc318-8f14-4606-aa81-1a16a1ed697b" containerID="78057c63673a15e66a2ca94e3c6e6f824657a9acd19f9a6319b211f221f08fd0" exitCode=0 Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.589412 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0317-account-create-update-p9xsz" event={"ID":"59cdc318-8f14-4606-aa81-1a16a1ed697b","Type":"ContainerDied","Data":"78057c63673a15e66a2ca94e3c6e6f824657a9acd19f9a6319b211f221f08fd0"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.589447 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0317-account-create-update-p9xsz" event={"ID":"59cdc318-8f14-4606-aa81-1a16a1ed697b","Type":"ContainerStarted","Data":"7fbb8cb26186cf6591c33c98392e00c4a14c4638b79a479dc004480c0e8965b4"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.592461 4788 generic.go:334] "Generic (PLEG): container finished" podID="7b9a6da4-e188-4741-b5a4-60a33b8cd415" containerID="e16ff6da4ee97e161bb886b720b207a7f7e2740a3dbc8460a121b42b125b0f93" exitCode=0 Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.592544 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d8ngr" event={"ID":"7b9a6da4-e188-4741-b5a4-60a33b8cd415","Type":"ContainerDied","Data":"e16ff6da4ee97e161bb886b720b207a7f7e2740a3dbc8460a121b42b125b0f93"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.592576 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d8ngr" event={"ID":"7b9a6da4-e188-4741-b5a4-60a33b8cd415","Type":"ContainerStarted","Data":"17de8b037b24be65da2f5a9db7d2b7ad1109ffee067673a39bfcd32c87fac056"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.597500 4788 generic.go:334] "Generic (PLEG): container finished" podID="b9eb8664-672b-45c0-a128-1e60f6ea6a0e" 
containerID="09c8b320d144ac0814de10a4fb2b0761acc8a243245d4f68d2647914529742be" exitCode=0 Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.597578 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-brnnm" event={"ID":"b9eb8664-672b-45c0-a128-1e60f6ea6a0e","Type":"ContainerDied","Data":"09c8b320d144ac0814de10a4fb2b0761acc8a243245d4f68d2647914529742be"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.597609 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-brnnm" event={"ID":"b9eb8664-672b-45c0-a128-1e60f6ea6a0e","Type":"ContainerStarted","Data":"a03cdbac5a94bed0560a5dd8d9311d101fa75f1b5462ddf1809a7e9d1a829908"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.599432 4788 generic.go:334] "Generic (PLEG): container finished" podID="da103dab-8e46-466c-90db-c237910cc9e7" containerID="03c8e46258faa59402c50956ae0fdb7548577672ca54b8637f9ef9880d95928f" exitCode=0 Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.599530 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-5w8kn" event={"ID":"da103dab-8e46-466c-90db-c237910cc9e7","Type":"ContainerDied","Data":"03c8e46258faa59402c50956ae0fdb7548577672ca54b8637f9ef9880d95928f"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.602768 4788 generic.go:334] "Generic (PLEG): container finished" podID="d1223d0f-9cda-4590-9ae6-353c58886f99" containerID="4a9e154742d18f92d38fe4f84c0426a40a4f18da975faeaf7c398a442720a2a5" exitCode=0 Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.602842 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-07bb-account-create-update-kvcc2" event={"ID":"d1223d0f-9cda-4590-9ae6-353c58886f99","Type":"ContainerDied","Data":"4a9e154742d18f92d38fe4f84c0426a40a4f18da975faeaf7c398a442720a2a5"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.606185 4788 generic.go:334] "Generic (PLEG): container finished" 
podID="231f701c-d9c4-4157-bcf2-fe8875ce36e7" containerID="b684a72825a8b22fe4690f2297d04452cb5f4f0fdd16d107ac50e805be889f68" exitCode=0 Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.606277 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jdh5n" event={"ID":"231f701c-d9c4-4157-bcf2-fe8875ce36e7","Type":"ContainerDied","Data":"b684a72825a8b22fe4690f2297d04452cb5f4f0fdd16d107ac50e805be889f68"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.620462 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zzzf4" event={"ID":"ccb2eae7-2f3d-424a-b805-b8452ceee91f","Type":"ContainerStarted","Data":"29b9ea324ea05aaac8dced61d0762c42873b3189e903b940e9309fc213dbf9e5"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.628386 4788 generic.go:334] "Generic (PLEG): container finished" podID="ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d" containerID="074375f36cab7ec00e9b4968ca05f676bd5421a49af0a0c78ed0bcee0a498ba9" exitCode=0 Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.628439 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fbfe-account-create-update-jmjqh" event={"ID":"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d","Type":"ContainerDied","Data":"074375f36cab7ec00e9b4968ca05f676bd5421a49af0a0c78ed0bcee0a498ba9"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.632563 4788 generic.go:334] "Generic (PLEG): container finished" podID="d07fa4d7-5916-4288-9b30-0413795f6a69" containerID="9820c56563cc79ee44a2fc15ef198e67fc06cb632c550f119cd9c88baa91b836" exitCode=0 Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.632601 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b159-account-create-update-lk6tl" event={"ID":"d07fa4d7-5916-4288-9b30-0413795f6a69","Type":"ContainerDied","Data":"9820c56563cc79ee44a2fc15ef198e67fc06cb632c550f119cd9c88baa91b836"} Feb 19 09:01:29 crc kubenswrapper[4788]: I0219 09:01:29.632620 4788 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-b159-account-create-update-lk6tl" event={"ID":"d07fa4d7-5916-4288-9b30-0413795f6a69","Type":"ContainerStarted","Data":"5d2feb6e3e4e976efa5affa9726971a6f9c1a255a4748417c78a46945f43fe5f"} Feb 19 09:01:30 crc kubenswrapper[4788]: I0219 09:01:30.027922 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jdh5n" Feb 19 09:01:30 crc kubenswrapper[4788]: I0219 09:01:30.092455 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231f701c-d9c4-4157-bcf2-fe8875ce36e7-operator-scripts\") pod \"231f701c-d9c4-4157-bcf2-fe8875ce36e7\" (UID: \"231f701c-d9c4-4157-bcf2-fe8875ce36e7\") " Feb 19 09:01:30 crc kubenswrapper[4788]: I0219 09:01:30.092702 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbtjm\" (UniqueName: \"kubernetes.io/projected/231f701c-d9c4-4157-bcf2-fe8875ce36e7-kube-api-access-rbtjm\") pod \"231f701c-d9c4-4157-bcf2-fe8875ce36e7\" (UID: \"231f701c-d9c4-4157-bcf2-fe8875ce36e7\") " Feb 19 09:01:30 crc kubenswrapper[4788]: I0219 09:01:30.094663 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/231f701c-d9c4-4157-bcf2-fe8875ce36e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "231f701c-d9c4-4157-bcf2-fe8875ce36e7" (UID: "231f701c-d9c4-4157-bcf2-fe8875ce36e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:30 crc kubenswrapper[4788]: I0219 09:01:30.100654 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231f701c-d9c4-4157-bcf2-fe8875ce36e7-kube-api-access-rbtjm" (OuterVolumeSpecName: "kube-api-access-rbtjm") pod "231f701c-d9c4-4157-bcf2-fe8875ce36e7" (UID: "231f701c-d9c4-4157-bcf2-fe8875ce36e7"). 
InnerVolumeSpecName "kube-api-access-rbtjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:30 crc kubenswrapper[4788]: I0219 09:01:30.194761 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbtjm\" (UniqueName: \"kubernetes.io/projected/231f701c-d9c4-4157-bcf2-fe8875ce36e7-kube-api-access-rbtjm\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:30 crc kubenswrapper[4788]: I0219 09:01:30.194808 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/231f701c-d9c4-4157-bcf2-fe8875ce36e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:30 crc kubenswrapper[4788]: I0219 09:01:30.645634 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jdh5n" Feb 19 09:01:30 crc kubenswrapper[4788]: I0219 09:01:30.677982 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jdh5n" event={"ID":"231f701c-d9c4-4157-bcf2-fe8875ce36e7","Type":"ContainerDied","Data":"cd04a1c229c291e549b9fceb46cb2033f0e4b0f094154d28ee1aa1e3d7aa54f9"} Feb 19 09:01:30 crc kubenswrapper[4788]: I0219 09:01:30.678097 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd04a1c229c291e549b9fceb46cb2033f0e4b0f094154d28ee1aa1e3d7aa54f9" Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.230608 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.309991 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bwfcl"] Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.310284 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-bwfcl" podUID="9fa7a666-34c5-42b5-9c2b-25d39c505be2" containerName="dnsmasq-dns" 
containerID="cri-o://9396ef11e5bff8e56438a2fb5019baf319c491f2c99e39a05c7f25568cfddd0a" gracePeriod=10 Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.667483 4788 generic.go:334] "Generic (PLEG): container finished" podID="9fa7a666-34c5-42b5-9c2b-25d39c505be2" containerID="9396ef11e5bff8e56438a2fb5019baf319c491f2c99e39a05c7f25568cfddd0a" exitCode=0 Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.667525 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bwfcl" event={"ID":"9fa7a666-34c5-42b5-9c2b-25d39c505be2","Type":"ContainerDied","Data":"9396ef11e5bff8e56438a2fb5019baf319c491f2c99e39a05c7f25568cfddd0a"} Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.929783 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b159-account-create-update-lk6tl" Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.950465 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-07bb-account-create-update-kvcc2" Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.964290 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0317-account-create-update-p9xsz" Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.969656 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87gxf\" (UniqueName: \"kubernetes.io/projected/d07fa4d7-5916-4288-9b30-0413795f6a69-kube-api-access-87gxf\") pod \"d07fa4d7-5916-4288-9b30-0413795f6a69\" (UID: \"d07fa4d7-5916-4288-9b30-0413795f6a69\") " Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.969774 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d07fa4d7-5916-4288-9b30-0413795f6a69-operator-scripts\") pod \"d07fa4d7-5916-4288-9b30-0413795f6a69\" (UID: \"d07fa4d7-5916-4288-9b30-0413795f6a69\") " Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.970842 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07fa4d7-5916-4288-9b30-0413795f6a69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d07fa4d7-5916-4288-9b30-0413795f6a69" (UID: "d07fa4d7-5916-4288-9b30-0413795f6a69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.972387 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fbfe-account-create-update-jmjqh" Feb 19 09:01:33 crc kubenswrapper[4788]: I0219 09:01:33.978302 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d07fa4d7-5916-4288-9b30-0413795f6a69-kube-api-access-87gxf" (OuterVolumeSpecName: "kube-api-access-87gxf") pod "d07fa4d7-5916-4288-9b30-0413795f6a69" (UID: "d07fa4d7-5916-4288-9b30-0413795f6a69"). InnerVolumeSpecName "kube-api-access-87gxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.015337 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d8ngr" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.038488 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-brnnm" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.052689 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-5w8kn" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075121 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59cdc318-8f14-4606-aa81-1a16a1ed697b-operator-scripts\") pod \"59cdc318-8f14-4606-aa81-1a16a1ed697b\" (UID: \"59cdc318-8f14-4606-aa81-1a16a1ed697b\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075164 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b9a6da4-e188-4741-b5a4-60a33b8cd415-operator-scripts\") pod \"7b9a6da4-e188-4741-b5a4-60a33b8cd415\" (UID: \"7b9a6da4-e188-4741-b5a4-60a33b8cd415\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075185 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6xtt\" (UniqueName: \"kubernetes.io/projected/7b9a6da4-e188-4741-b5a4-60a33b8cd415-kube-api-access-b6xtt\") pod \"7b9a6da4-e188-4741-b5a4-60a33b8cd415\" (UID: \"7b9a6da4-e188-4741-b5a4-60a33b8cd415\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075210 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1223d0f-9cda-4590-9ae6-353c58886f99-operator-scripts\") pod 
\"d1223d0f-9cda-4590-9ae6-353c58886f99\" (UID: \"d1223d0f-9cda-4590-9ae6-353c58886f99\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075338 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv5cx\" (UniqueName: \"kubernetes.io/projected/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-kube-api-access-xv5cx\") pod \"b9eb8664-672b-45c0-a128-1e60f6ea6a0e\" (UID: \"b9eb8664-672b-45c0-a128-1e60f6ea6a0e\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075355 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-operator-scripts\") pod \"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d\" (UID: \"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075380 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksk9m\" (UniqueName: \"kubernetes.io/projected/d1223d0f-9cda-4590-9ae6-353c58886f99-kube-api-access-ksk9m\") pod \"d1223d0f-9cda-4590-9ae6-353c58886f99\" (UID: \"d1223d0f-9cda-4590-9ae6-353c58886f99\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075406 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psgdr\" (UniqueName: \"kubernetes.io/projected/59cdc318-8f14-4606-aa81-1a16a1ed697b-kube-api-access-psgdr\") pod \"59cdc318-8f14-4606-aa81-1a16a1ed697b\" (UID: \"59cdc318-8f14-4606-aa81-1a16a1ed697b\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075437 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-operator-scripts\") pod \"b9eb8664-672b-45c0-a128-1e60f6ea6a0e\" (UID: \"b9eb8664-672b-45c0-a128-1e60f6ea6a0e\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075472 4788 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldq45\" (UniqueName: \"kubernetes.io/projected/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-kube-api-access-ldq45\") pod \"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d\" (UID: \"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075752 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d07fa4d7-5916-4288-9b30-0413795f6a69-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.075767 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87gxf\" (UniqueName: \"kubernetes.io/projected/d07fa4d7-5916-4288-9b30-0413795f6a69-kube-api-access-87gxf\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.076226 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1223d0f-9cda-4590-9ae6-353c58886f99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1223d0f-9cda-4590-9ae6-353c58886f99" (UID: "d1223d0f-9cda-4590-9ae6-353c58886f99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.076293 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b9a6da4-e188-4741-b5a4-60a33b8cd415-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b9a6da4-e188-4741-b5a4-60a33b8cd415" (UID: "7b9a6da4-e188-4741-b5a4-60a33b8cd415"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.077032 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d" (UID: "ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.077139 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b9eb8664-672b-45c0-a128-1e60f6ea6a0e" (UID: "b9eb8664-672b-45c0-a128-1e60f6ea6a0e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.080268 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.081763 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9a6da4-e188-4741-b5a4-60a33b8cd415-kube-api-access-b6xtt" (OuterVolumeSpecName: "kube-api-access-b6xtt") pod "7b9a6da4-e188-4741-b5a4-60a33b8cd415" (UID: "7b9a6da4-e188-4741-b5a4-60a33b8cd415"). InnerVolumeSpecName "kube-api-access-b6xtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.082317 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-kube-api-access-ldq45" (OuterVolumeSpecName: "kube-api-access-ldq45") pod "ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d" (UID: "ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d"). 
InnerVolumeSpecName "kube-api-access-ldq45". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.082428 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59cdc318-8f14-4606-aa81-1a16a1ed697b-kube-api-access-psgdr" (OuterVolumeSpecName: "kube-api-access-psgdr") pod "59cdc318-8f14-4606-aa81-1a16a1ed697b" (UID: "59cdc318-8f14-4606-aa81-1a16a1ed697b"). InnerVolumeSpecName "kube-api-access-psgdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.082496 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1223d0f-9cda-4590-9ae6-353c58886f99-kube-api-access-ksk9m" (OuterVolumeSpecName: "kube-api-access-ksk9m") pod "d1223d0f-9cda-4590-9ae6-353c58886f99" (UID: "d1223d0f-9cda-4590-9ae6-353c58886f99"). InnerVolumeSpecName "kube-api-access-ksk9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.086459 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59cdc318-8f14-4606-aa81-1a16a1ed697b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59cdc318-8f14-4606-aa81-1a16a1ed697b" (UID: "59cdc318-8f14-4606-aa81-1a16a1ed697b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.092049 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-kube-api-access-xv5cx" (OuterVolumeSpecName: "kube-api-access-xv5cx") pod "b9eb8664-672b-45c0-a128-1e60f6ea6a0e" (UID: "b9eb8664-672b-45c0-a128-1e60f6ea6a0e"). InnerVolumeSpecName "kube-api-access-xv5cx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.176835 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-dns-svc\") pod \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177205 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8nwb\" (UniqueName: \"kubernetes.io/projected/9fa7a666-34c5-42b5-9c2b-25d39c505be2-kube-api-access-c8nwb\") pod \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177315 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-config\") pod \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177342 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da103dab-8e46-466c-90db-c237910cc9e7-operator-scripts\") pod \"da103dab-8e46-466c-90db-c237910cc9e7\" (UID: \"da103dab-8e46-466c-90db-c237910cc9e7\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177408 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-sb\") pod \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177439 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gw2n\" (UniqueName: 
\"kubernetes.io/projected/da103dab-8e46-466c-90db-c237910cc9e7-kube-api-access-2gw2n\") pod \"da103dab-8e46-466c-90db-c237910cc9e7\" (UID: \"da103dab-8e46-466c-90db-c237910cc9e7\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177510 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-nb\") pod \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\" (UID: \"9fa7a666-34c5-42b5-9c2b-25d39c505be2\") " Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177891 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psgdr\" (UniqueName: \"kubernetes.io/projected/59cdc318-8f14-4606-aa81-1a16a1ed697b-kube-api-access-psgdr\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177909 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177902 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da103dab-8e46-466c-90db-c237910cc9e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "da103dab-8e46-466c-90db-c237910cc9e7" (UID: "da103dab-8e46-466c-90db-c237910cc9e7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177921 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldq45\" (UniqueName: \"kubernetes.io/projected/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-kube-api-access-ldq45\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177970 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59cdc318-8f14-4606-aa81-1a16a1ed697b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177981 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b9a6da4-e188-4741-b5a4-60a33b8cd415-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.177991 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6xtt\" (UniqueName: \"kubernetes.io/projected/7b9a6da4-e188-4741-b5a4-60a33b8cd415-kube-api-access-b6xtt\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.178003 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1223d0f-9cda-4590-9ae6-353c58886f99-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.178012 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv5cx\" (UniqueName: \"kubernetes.io/projected/b9eb8664-672b-45c0-a128-1e60f6ea6a0e-kube-api-access-xv5cx\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.178021 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 
09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.178032 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksk9m\" (UniqueName: \"kubernetes.io/projected/d1223d0f-9cda-4590-9ae6-353c58886f99-kube-api-access-ksk9m\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.181948 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fa7a666-34c5-42b5-9c2b-25d39c505be2-kube-api-access-c8nwb" (OuterVolumeSpecName: "kube-api-access-c8nwb") pod "9fa7a666-34c5-42b5-9c2b-25d39c505be2" (UID: "9fa7a666-34c5-42b5-9c2b-25d39c505be2"). InnerVolumeSpecName "kube-api-access-c8nwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.182725 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da103dab-8e46-466c-90db-c237910cc9e7-kube-api-access-2gw2n" (OuterVolumeSpecName: "kube-api-access-2gw2n") pod "da103dab-8e46-466c-90db-c237910cc9e7" (UID: "da103dab-8e46-466c-90db-c237910cc9e7"). InnerVolumeSpecName "kube-api-access-2gw2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.220945 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9fa7a666-34c5-42b5-9c2b-25d39c505be2" (UID: "9fa7a666-34c5-42b5-9c2b-25d39c505be2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.221920 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-config" (OuterVolumeSpecName: "config") pod "9fa7a666-34c5-42b5-9c2b-25d39c505be2" (UID: "9fa7a666-34c5-42b5-9c2b-25d39c505be2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.224421 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9fa7a666-34c5-42b5-9c2b-25d39c505be2" (UID: "9fa7a666-34c5-42b5-9c2b-25d39c505be2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.238742 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9fa7a666-34c5-42b5-9c2b-25d39c505be2" (UID: "9fa7a666-34c5-42b5-9c2b-25d39c505be2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.279603 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.279646 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/da103dab-8e46-466c-90db-c237910cc9e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.279664 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.279682 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gw2n\" (UniqueName: \"kubernetes.io/projected/da103dab-8e46-466c-90db-c237910cc9e7-kube-api-access-2gw2n\") on node \"crc\" 
DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.279698 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.279736 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fa7a666-34c5-42b5-9c2b-25d39c505be2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.279763 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8nwb\" (UniqueName: \"kubernetes.io/projected/9fa7a666-34c5-42b5-9c2b-25d39c505be2-kube-api-access-c8nwb\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.681440 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-07bb-account-create-update-kvcc2" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.681415 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-07bb-account-create-update-kvcc2" event={"ID":"d1223d0f-9cda-4590-9ae6-353c58886f99","Type":"ContainerDied","Data":"a0b0bc7edc28e9baf93eb3e779ccab7e468fddfce2f1e8eb7dd11a88269efe3c"} Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.681623 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0b0bc7edc28e9baf93eb3e779ccab7e468fddfce2f1e8eb7dd11a88269efe3c" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.684063 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-bwfcl" event={"ID":"9fa7a666-34c5-42b5-9c2b-25d39c505be2","Type":"ContainerDied","Data":"712f25f3544699b4ab45bf3cebc04e4bfcda7dc2eeacb8dbde130902f1d6f671"} Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.684109 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-bwfcl" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.684199 4788 scope.go:117] "RemoveContainer" containerID="9396ef11e5bff8e56438a2fb5019baf319c491f2c99e39a05c7f25568cfddd0a" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.686314 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zzzf4" event={"ID":"ccb2eae7-2f3d-424a-b805-b8452ceee91f","Type":"ContainerStarted","Data":"960586f2ddb2bd161f18289f539caa6c3a3e969df144acb5a08f9c63b0606487"} Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.688563 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0317-account-create-update-p9xsz" event={"ID":"59cdc318-8f14-4606-aa81-1a16a1ed697b","Type":"ContainerDied","Data":"7fbb8cb26186cf6591c33c98392e00c4a14c4638b79a479dc004480c0e8965b4"} Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.688599 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fbb8cb26186cf6591c33c98392e00c4a14c4638b79a479dc004480c0e8965b4" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.688614 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0317-account-create-update-p9xsz" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.691863 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d8ngr" event={"ID":"7b9a6da4-e188-4741-b5a4-60a33b8cd415","Type":"ContainerDied","Data":"17de8b037b24be65da2f5a9db7d2b7ad1109ffee067673a39bfcd32c87fac056"} Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.692225 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17de8b037b24be65da2f5a9db7d2b7ad1109ffee067673a39bfcd32c87fac056" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.691953 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-d8ngr" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.693789 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fbfe-account-create-update-jmjqh" event={"ID":"ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d","Type":"ContainerDied","Data":"6037864e63a6ddf0e056b72e75b44a68dc6caeef20e718ab2db587eb9f6942f0"} Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.693932 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6037864e63a6ddf0e056b72e75b44a68dc6caeef20e718ab2db587eb9f6942f0" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.694046 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fbfe-account-create-update-jmjqh" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.697594 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b159-account-create-update-lk6tl" event={"ID":"d07fa4d7-5916-4288-9b30-0413795f6a69","Type":"ContainerDied","Data":"5d2feb6e3e4e976efa5affa9726971a6f9c1a255a4748417c78a46945f43fe5f"} Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.697632 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d2feb6e3e4e976efa5affa9726971a6f9c1a255a4748417c78a46945f43fe5f" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.697665 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b159-account-create-update-lk6tl" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.699909 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-5w8kn" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.700418 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-5w8kn" event={"ID":"da103dab-8e46-466c-90db-c237910cc9e7","Type":"ContainerDied","Data":"f94bfed4ac6c4d086a519a2ee06d6faf2432b4cb80bd659f3f4aaa3cb8c49093"} Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.700470 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f94bfed4ac6c4d086a519a2ee06d6faf2432b4cb80bd659f3f4aaa3cb8c49093" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.702780 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-brnnm" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.702766 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-brnnm" event={"ID":"b9eb8664-672b-45c0-a128-1e60f6ea6a0e","Type":"ContainerDied","Data":"a03cdbac5a94bed0560a5dd8d9311d101fa75f1b5462ddf1809a7e9d1a829908"} Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.702950 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a03cdbac5a94bed0560a5dd8d9311d101fa75f1b5462ddf1809a7e9d1a829908" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.720926 4788 scope.go:117] "RemoveContainer" containerID="6bd6ea079e989126512aaf5ce85ea03d4bb3cc021826f03dd8d39ec46a0829bb" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.881929 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-zzzf4" podStartSLOduration=2.685194483 podStartE2EDuration="7.881911404s" podCreationTimestamp="2026-02-19 09:01:27 +0000 UTC" firstStartedPulling="2026-02-19 09:01:28.597515958 +0000 UTC m=+990.585527440" lastFinishedPulling="2026-02-19 09:01:33.794232889 +0000 UTC m=+995.782244361" observedRunningTime="2026-02-19 09:01:34.712122138 +0000 UTC 
m=+996.700133650" watchObservedRunningTime="2026-02-19 09:01:34.881911404 +0000 UTC m=+996.869922886" Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.889572 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bwfcl"] Feb 19 09:01:34 crc kubenswrapper[4788]: I0219 09:01:34.897291 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-bwfcl"] Feb 19 09:01:36 crc kubenswrapper[4788]: I0219 09:01:36.731203 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fa7a666-34c5-42b5-9c2b-25d39c505be2" path="/var/lib/kubelet/pods/9fa7a666-34c5-42b5-9c2b-25d39c505be2/volumes" Feb 19 09:01:37 crc kubenswrapper[4788]: I0219 09:01:37.732089 4788 generic.go:334] "Generic (PLEG): container finished" podID="ccb2eae7-2f3d-424a-b805-b8452ceee91f" containerID="960586f2ddb2bd161f18289f539caa6c3a3e969df144acb5a08f9c63b0606487" exitCode=0 Feb 19 09:01:37 crc kubenswrapper[4788]: I0219 09:01:37.732137 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zzzf4" event={"ID":"ccb2eae7-2f3d-424a-b805-b8452ceee91f","Type":"ContainerDied","Data":"960586f2ddb2bd161f18289f539caa6c3a3e969df144acb5a08f9c63b0606487"} Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.101068 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.178780 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-config-data\") pod \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\" (UID: \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.179011 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh59s\" (UniqueName: \"kubernetes.io/projected/ccb2eae7-2f3d-424a-b805-b8452ceee91f-kube-api-access-qh59s\") pod \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\" (UID: \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.179079 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-combined-ca-bundle\") pod \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\" (UID: \"ccb2eae7-2f3d-424a-b805-b8452ceee91f\") " Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.189401 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb2eae7-2f3d-424a-b805-b8452ceee91f-kube-api-access-qh59s" (OuterVolumeSpecName: "kube-api-access-qh59s") pod "ccb2eae7-2f3d-424a-b805-b8452ceee91f" (UID: "ccb2eae7-2f3d-424a-b805-b8452ceee91f"). InnerVolumeSpecName "kube-api-access-qh59s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.216506 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccb2eae7-2f3d-424a-b805-b8452ceee91f" (UID: "ccb2eae7-2f3d-424a-b805-b8452ceee91f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.233923 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-config-data" (OuterVolumeSpecName: "config-data") pod "ccb2eae7-2f3d-424a-b805-b8452ceee91f" (UID: "ccb2eae7-2f3d-424a-b805-b8452ceee91f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.283942 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.284003 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh59s\" (UniqueName: \"kubernetes.io/projected/ccb2eae7-2f3d-424a-b805-b8452ceee91f-kube-api-access-qh59s\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.284033 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb2eae7-2f3d-424a-b805-b8452ceee91f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.748558 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zzzf4" event={"ID":"ccb2eae7-2f3d-424a-b805-b8452ceee91f","Type":"ContainerDied","Data":"29b9ea324ea05aaac8dced61d0762c42873b3189e903b940e9309fc213dbf9e5"} Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.748604 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29b9ea324ea05aaac8dced61d0762c42873b3189e903b940e9309fc213dbf9e5" Feb 19 09:01:39 crc kubenswrapper[4788]: I0219 09:01:39.748690 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zzzf4" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.039606 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-vt8wd"] Feb 19 09:01:40 crc kubenswrapper[4788]: E0219 09:01:40.040053 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59cdc318-8f14-4606-aa81-1a16a1ed697b" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040079 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="59cdc318-8f14-4606-aa81-1a16a1ed697b" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: E0219 09:01:40.040101 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07fa4d7-5916-4288-9b30-0413795f6a69" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040112 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07fa4d7-5916-4288-9b30-0413795f6a69" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: E0219 09:01:40.040124 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040132 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: E0219 09:01:40.040153 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231f701c-d9c4-4157-bcf2-fe8875ce36e7" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040168 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="231f701c-d9c4-4157-bcf2-fe8875ce36e7" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: E0219 09:01:40.040175 4788 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa7a666-34c5-42b5-9c2b-25d39c505be2" containerName="dnsmasq-dns" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040182 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa7a666-34c5-42b5-9c2b-25d39c505be2" containerName="dnsmasq-dns" Feb 19 09:01:40 crc kubenswrapper[4788]: E0219 09:01:40.040202 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eb8664-672b-45c0-a128-1e60f6ea6a0e" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040214 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eb8664-672b-45c0-a128-1e60f6ea6a0e" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: E0219 09:01:40.040230 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9a6da4-e188-4741-b5a4-60a33b8cd415" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040238 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9a6da4-e188-4741-b5a4-60a33b8cd415" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: E0219 09:01:40.040268 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1223d0f-9cda-4590-9ae6-353c58886f99" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040278 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1223d0f-9cda-4590-9ae6-353c58886f99" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: E0219 09:01:40.040292 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb2eae7-2f3d-424a-b805-b8452ceee91f" containerName="keystone-db-sync" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040298 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb2eae7-2f3d-424a-b805-b8452ceee91f" containerName="keystone-db-sync" Feb 19 09:01:40 crc 
kubenswrapper[4788]: E0219 09:01:40.040311 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fa7a666-34c5-42b5-9c2b-25d39c505be2" containerName="init" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040317 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fa7a666-34c5-42b5-9c2b-25d39c505be2" containerName="init" Feb 19 09:01:40 crc kubenswrapper[4788]: E0219 09:01:40.040327 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da103dab-8e46-466c-90db-c237910cc9e7" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040333 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="da103dab-8e46-466c-90db-c237910cc9e7" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040503 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fa7a666-34c5-42b5-9c2b-25d39c505be2" containerName="dnsmasq-dns" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040516 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="da103dab-8e46-466c-90db-c237910cc9e7" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040524 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040540 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="d07fa4d7-5916-4288-9b30-0413795f6a69" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040550 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="59cdc318-8f14-4606-aa81-1a16a1ed697b" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040556 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb2eae7-2f3d-424a-b805-b8452ceee91f" 
containerName="keystone-db-sync" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040565 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9a6da4-e188-4741-b5a4-60a33b8cd415" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040573 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9eb8664-672b-45c0-a128-1e60f6ea6a0e" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040582 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="231f701c-d9c4-4157-bcf2-fe8875ce36e7" containerName="mariadb-database-create" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.040595 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1223d0f-9cda-4590-9ae6-353c58886f99" containerName="mariadb-account-create-update" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.041450 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.052314 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-vt8wd"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.078366 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vprwn"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.079518 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.086479 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.086714 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.086818 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rzps4" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.087294 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.087872 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.091106 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vprwn"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.101201 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9h5s\" (UniqueName: \"kubernetes.io/projected/1d1f649e-07c2-498d-8a63-4e7192c84af1-kube-api-access-r9h5s\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.101277 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.101354 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.101375 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-svc\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.101410 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.101426 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-config\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.180322 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-wf626"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.181369 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-wf626" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.183057 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wf626"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.187925 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-tjzqh" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.188127 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.203519 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-scripts\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.203598 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-fernet-keys\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.203681 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.203705 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-credential-keys\") pod 
\"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.203733 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-svc\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.203763 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-combined-ca-bundle\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.203808 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.203831 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-config\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.203891 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9h5s\" (UniqueName: \"kubernetes.io/projected/1d1f649e-07c2-498d-8a63-4e7192c84af1-kube-api-access-r9h5s\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: 
\"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.203915 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-config-data\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.203954 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.204004 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rq8l\" (UniqueName: \"kubernetes.io/projected/2d236290-fb64-4f36-9c80-1b7c9d74cca4-kube-api-access-6rq8l\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.205191 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.205841 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: 
\"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.206319 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.207591 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-config\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.208822 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-svc\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.236338 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9h5s\" (UniqueName: \"kubernetes.io/projected/1d1f649e-07c2-498d-8a63-4e7192c84af1-kube-api-access-r9h5s\") pod \"dnsmasq-dns-847c4cc679-vt8wd\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.305551 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rq8l\" (UniqueName: \"kubernetes.io/projected/2d236290-fb64-4f36-9c80-1b7c9d74cca4-kube-api-access-6rq8l\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 
09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.305604 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-scripts\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.305638 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-fernet-keys\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.305692 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-credential-keys\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.305716 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-combined-ca-bundle\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.305747 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn525\" (UniqueName: \"kubernetes.io/projected/708e9c03-709f-4846-bf0f-abb71c9e164f-kube-api-access-xn525\") pod \"heat-db-sync-wf626\" (UID: \"708e9c03-709f-4846-bf0f-abb71c9e164f\") " pod="openstack/heat-db-sync-wf626" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.305788 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-config-data\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.305806 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-combined-ca-bundle\") pod \"heat-db-sync-wf626\" (UID: \"708e9c03-709f-4846-bf0f-abb71c9e164f\") " pod="openstack/heat-db-sync-wf626" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.305827 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-config-data\") pod \"heat-db-sync-wf626\" (UID: \"708e9c03-709f-4846-bf0f-abb71c9e164f\") " pod="openstack/heat-db-sync-wf626" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.314944 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-combined-ca-bundle\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.318146 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-scripts\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.319819 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-8lpgx"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.321267 
4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.323181 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-fernet-keys\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.324124 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-credential-keys\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.338904 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.339229 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vg8sl" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.339619 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.341159 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-config-data\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.358340 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.363575 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.366882 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.369941 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rq8l\" (UniqueName: \"kubernetes.io/projected/2d236290-fb64-4f36-9c80-1b7c9d74cca4-kube-api-access-6rq8l\") pod \"keystone-bootstrap-vprwn\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.374305 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8lpgx"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.374649 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.380567 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.410312 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411239 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctgvz\" (UniqueName: \"kubernetes.io/projected/c2a3acb8-146c-47c0-9218-81cd2728edf9-kube-api-access-ctgvz\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411303 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn525\" (UniqueName: \"kubernetes.io/projected/708e9c03-709f-4846-bf0f-abb71c9e164f-kube-api-access-xn525\") pod \"heat-db-sync-wf626\" (UID: 
\"708e9c03-709f-4846-bf0f-abb71c9e164f\") " pod="openstack/heat-db-sync-wf626" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411341 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-log-httpd\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411374 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-config-data\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411391 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r69hc\" (UniqueName: \"kubernetes.io/projected/604aa3ed-40d6-437a-93f3-0e7a445b862b-kube-api-access-r69hc\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411412 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-combined-ca-bundle\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411430 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-scripts\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: 
I0219 09:01:40.411451 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-combined-ca-bundle\") pod \"heat-db-sync-wf626\" (UID: \"708e9c03-709f-4846-bf0f-abb71c9e164f\") " pod="openstack/heat-db-sync-wf626" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411471 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411487 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-config-data\") pod \"heat-db-sync-wf626\" (UID: \"708e9c03-709f-4846-bf0f-abb71c9e164f\") " pod="openstack/heat-db-sync-wf626" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411506 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-scripts\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411536 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-db-sync-config-data\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411570 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2a3acb8-146c-47c0-9218-81cd2728edf9-etc-machine-id\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411585 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-run-httpd\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411617 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-config-data\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.411645 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.432462 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-ps9px"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.435225 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-combined-ca-bundle\") pod \"heat-db-sync-wf626\" (UID: \"708e9c03-709f-4846-bf0f-abb71c9e164f\") " pod="openstack/heat-db-sync-wf626" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.441735 4788 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.450605 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ps9px" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.450745 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn525\" (UniqueName: \"kubernetes.io/projected/708e9c03-709f-4846-bf0f-abb71c9e164f-kube-api-access-xn525\") pod \"heat-db-sync-wf626\" (UID: \"708e9c03-709f-4846-bf0f-abb71c9e164f\") " pod="openstack/heat-db-sync-wf626" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.456903 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-config-data\") pod \"heat-db-sync-wf626\" (UID: \"708e9c03-709f-4846-bf0f-abb71c9e164f\") " pod="openstack/heat-db-sync-wf626" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.468079 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.468361 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zm5vb" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.468470 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.474329 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ps9px"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.506508 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wllnd"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.507936 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wllnd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.518213 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zxtj7" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.518605 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.528173 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wf626" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548070 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-config-data\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548138 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r69hc\" (UniqueName: \"kubernetes.io/projected/604aa3ed-40d6-437a-93f3-0e7a445b862b-kube-api-access-r69hc\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548164 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-combined-ca-bundle\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548189 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-scripts\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " 
pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548217 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548257 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-scripts\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548282 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-config\") pod \"neutron-db-sync-ps9px\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " pod="openstack/neutron-db-sync-ps9px" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548310 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-db-sync-config-data\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548344 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2a3acb8-146c-47c0-9218-81cd2728edf9-etc-machine-id\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548366 4788 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-run-httpd\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548386 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-config-data\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548407 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjf97\" (UniqueName: \"kubernetes.io/projected/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-kube-api-access-vjf97\") pod \"neutron-db-sync-ps9px\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " pod="openstack/neutron-db-sync-ps9px" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548433 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548453 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-combined-ca-bundle\") pod \"neutron-db-sync-ps9px\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " pod="openstack/neutron-db-sync-ps9px" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548510 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctgvz\" (UniqueName: 
\"kubernetes.io/projected/c2a3acb8-146c-47c0-9218-81cd2728edf9-kube-api-access-ctgvz\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.548543 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-log-httpd\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.549145 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-log-httpd\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.572011 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2a3acb8-146c-47c0-9218-81cd2728edf9-etc-machine-id\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.575921 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-run-httpd\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.580183 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-config-data\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.586602 
4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r69hc\" (UniqueName: \"kubernetes.io/projected/604aa3ed-40d6-437a-93f3-0e7a445b862b-kube-api-access-r69hc\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.589161 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-config-data\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.590161 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-scripts\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.605206 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-combined-ca-bundle\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.607710 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.612417 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-scripts\") pod \"ceilometer-0\" (UID: 
\"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.635057 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") " pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.651360 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wllnd"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.651460 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-db-sync-config-data\") pod \"barbican-db-sync-wllnd\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " pod="openstack/barbican-db-sync-wllnd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.651984 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-config\") pod \"neutron-db-sync-ps9px\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " pod="openstack/neutron-db-sync-ps9px" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.652055 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-combined-ca-bundle\") pod \"barbican-db-sync-wllnd\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " pod="openstack/barbican-db-sync-wllnd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.652626 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjf97\" (UniqueName: 
\"kubernetes.io/projected/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-kube-api-access-vjf97\") pod \"neutron-db-sync-ps9px\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " pod="openstack/neutron-db-sync-ps9px" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.652671 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-combined-ca-bundle\") pod \"neutron-db-sync-ps9px\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " pod="openstack/neutron-db-sync-ps9px" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.653185 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhg86\" (UniqueName: \"kubernetes.io/projected/145312e4-8a69-4c17-964b-2183e2ff66b4-kube-api-access-zhg86\") pod \"barbican-db-sync-wllnd\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " pod="openstack/barbican-db-sync-wllnd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.656507 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-db-sync-config-data\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.659264 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctgvz\" (UniqueName: \"kubernetes.io/projected/c2a3acb8-146c-47c0-9218-81cd2728edf9-kube-api-access-ctgvz\") pod \"cinder-db-sync-8lpgx\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") " pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.663179 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-combined-ca-bundle\") pod \"neutron-db-sync-ps9px\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " pod="openstack/neutron-db-sync-ps9px" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.690324 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-vt8wd"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.690721 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-config\") pod \"neutron-db-sync-ps9px\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " pod="openstack/neutron-db-sync-ps9px" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.699812 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjf97\" (UniqueName: \"kubernetes.io/projected/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-kube-api-access-vjf97\") pod \"neutron-db-sync-ps9px\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " pod="openstack/neutron-db-sync-ps9px" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.706545 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ps9px" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.745353 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2dgzz"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.749093 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2dgzz"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.749306 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.759021 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-db-sync-config-data\") pod \"barbican-db-sync-wllnd\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " pod="openstack/barbican-db-sync-wllnd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.759134 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-combined-ca-bundle\") pod \"barbican-db-sync-wllnd\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " pod="openstack/barbican-db-sync-wllnd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.759217 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg86\" (UniqueName: \"kubernetes.io/projected/145312e4-8a69-4c17-964b-2183e2ff66b4-kube-api-access-zhg86\") pod \"barbican-db-sync-wllnd\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " pod="openstack/barbican-db-sync-wllnd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.765408 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-db-sync-config-data\") pod \"barbican-db-sync-wllnd\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " pod="openstack/barbican-db-sync-wllnd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.768674 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.770117 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gvv2s" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 
09:01:40.770987 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.772580 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-combined-ca-bundle\") pod \"barbican-db-sync-wllnd\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " pod="openstack/barbican-db-sync-wllnd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.788883 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-z4jjr"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.790183 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.793344 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-z4jjr"] Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.796187 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhg86\" (UniqueName: \"kubernetes.io/projected/145312e4-8a69-4c17-964b-2183e2ff66b4-kube-api-access-zhg86\") pod \"barbican-db-sync-wllnd\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " pod="openstack/barbican-db-sync-wllnd" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.861445 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-8lpgx" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.861619 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mhwm\" (UniqueName: \"kubernetes.io/projected/c7fed6a4-4d87-463f-84d2-942c28422b8b-kube-api-access-4mhwm\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.862659 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.862943 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-config\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.862991 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94j7h\" (UniqueName: \"kubernetes.io/projected/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-kube-api-access-94j7h\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.863024 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-swift-storage-0\") pod 
\"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.863061 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-config-data\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.863095 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7fed6a4-4d87-463f-84d2-942c28422b8b-logs\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.863138 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.863175 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-scripts\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.863198 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: 
\"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.863258 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-combined-ca-bundle\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.892496 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966176 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966231 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-config\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966284 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94j7h\" (UniqueName: \"kubernetes.io/projected/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-kube-api-access-94j7h\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966314 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966345 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-config-data\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966376 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7fed6a4-4d87-463f-84d2-942c28422b8b-logs\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966405 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966428 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-scripts\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966447 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: 
\"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966479 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-combined-ca-bundle\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966540 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mhwm\" (UniqueName: \"kubernetes.io/projected/c7fed6a4-4d87-463f-84d2-942c28422b8b-kube-api-access-4mhwm\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.966985 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7fed6a4-4d87-463f-84d2-942c28422b8b-logs\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.967192 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-config\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.967512 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 
09:01:40.967622 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.968084 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.969432 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.977792 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-config-data\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.980482 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-combined-ca-bundle\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.983043 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-scripts\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.984152 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mhwm\" (UniqueName: \"kubernetes.io/projected/c7fed6a4-4d87-463f-84d2-942c28422b8b-kube-api-access-4mhwm\") pod \"placement-db-sync-2dgzz\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:40 crc kubenswrapper[4788]: I0219 09:01:40.988366 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94j7h\" (UniqueName: \"kubernetes.io/projected/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-kube-api-access-94j7h\") pod \"dnsmasq-dns-785d8bcb8c-z4jjr\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.028987 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wllnd" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.092540 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2dgzz" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.126484 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-vt8wd"] Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.145174 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.188820 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.190583 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.196972 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.197355 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9pt5q" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.197652 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.208643 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.208737 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.267074 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wf626"] Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.273119 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.273237 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.273297 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.273333 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrqvk\" (UniqueName: \"kubernetes.io/projected/6ac74889-bac3-455a-83aa-b668300ad25d-kube-api-access-lrqvk\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.273375 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.273403 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.273428 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-logs\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.273452 4788 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.296394 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vprwn"] Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.371092 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.372503 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.377513 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.377582 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.377591 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.377651 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrqvk\" (UniqueName: 
\"kubernetes.io/projected/6ac74889-bac3-455a-83aa-b668300ad25d-kube-api-access-lrqvk\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.377697 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.377734 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.377768 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-logs\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.377788 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.377791 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.377858 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.378162 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.384055 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-logs\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.386482 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.401965 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.402847 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.407864 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.418274 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.459164 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ps9px"] Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.459865 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.467699 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.482347 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrqvk\" (UniqueName: \"kubernetes.io/projected/6ac74889-bac3-455a-83aa-b668300ad25d-kube-api-access-lrqvk\") pod \"glance-default-external-api-0\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " 
pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.497950 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.498007 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.498084 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlvln\" (UniqueName: \"kubernetes.io/projected/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-kube-api-access-xlvln\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.498120 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.498179 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.498269 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.498288 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.498323 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.544826 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.608350 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.608810 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.608848 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.608900 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.608932 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc 
kubenswrapper[4788]: I0219 09:01:41.608991 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlvln\" (UniqueName: \"kubernetes.io/projected/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-kube-api-access-xlvln\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.609029 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.609074 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.611631 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-logs\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.612736 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.615579 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.622468 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.651024 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.651856 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.652468 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.653349 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.654380 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlvln\" (UniqueName: \"kubernetes.io/projected/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-kube-api-access-xlvln\") pod \"glance-default-internal-api-0\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.722801 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.777680 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-8lpgx"] Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.793900 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.804399 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"604aa3ed-40d6-437a-93f3-0e7a445b862b","Type":"ContainerStarted","Data":"4cf532b6079b581b5ba944a8188f27318753a4102ffc74eddfc37f713c6f5e1c"} Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.821753 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" event={"ID":"1d1f649e-07c2-498d-8a63-4e7192c84af1","Type":"ContainerStarted","Data":"c5d912df9b700f0797ef3bc6756ff88927fa1e0a56f6853d287d549ffaa40fdf"} Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.823978 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vprwn" event={"ID":"2d236290-fb64-4f36-9c80-1b7c9d74cca4","Type":"ContainerStarted","Data":"ae90054f68659cf839baf08db8b9f8430fe7d745b94ca799c61a4f766bebd0d9"} 
Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.824949 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8lpgx" event={"ID":"c2a3acb8-146c-47c0-9218-81cd2728edf9","Type":"ContainerStarted","Data":"b8d12f04b1f95d703245bce0e2aa59c9e26cb7d0a4f80e022d389beb950dd1d8"} Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.826036 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ps9px" event={"ID":"528e3c62-47f6-4cf1-8b32-06dd6657c9f6","Type":"ContainerStarted","Data":"a807277345138cf730d5feb3906024ffe536e19ffb481911a0415cca1c29a95e"} Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.827116 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wf626" event={"ID":"708e9c03-709f-4846-bf0f-abb71c9e164f","Type":"ContainerStarted","Data":"c9a6caf271aed008f9b9370b0f59d811690c08352baddb40c5e6826f7b5a7d7f"} Feb 19 09:01:41 crc kubenswrapper[4788]: I0219 09:01:41.954382 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wllnd"] Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.074746 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2dgzz"] Feb 19 09:01:42 crc kubenswrapper[4788]: W0219 09:01:42.083463 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7fed6a4_4d87_463f_84d2_942c28422b8b.slice/crio-4e677ffb6919250ddcb8b49a52adeb01dc87ae16c3ce6763a3cb74bfa3a1136f WatchSource:0}: Error finding container 4e677ffb6919250ddcb8b49a52adeb01dc87ae16c3ce6763a3cb74bfa3a1136f: Status 404 returned error can't find the container with id 4e677ffb6919250ddcb8b49a52adeb01dc87ae16c3ce6763a3cb74bfa3a1136f Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.154205 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-z4jjr"] Feb 19 09:01:42 crc kubenswrapper[4788]: W0219 
09:01:42.161373 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e6a3b3f_85b6_4492_95cb_5834f14fe5b1.slice/crio-a8c4331829dd93bb91821a9282eaa46d902a77e738862fb7029964ae2dc9d06e WatchSource:0}: Error finding container a8c4331829dd93bb91821a9282eaa46d902a77e738862fb7029964ae2dc9d06e: Status 404 returned error can't find the container with id a8c4331829dd93bb91821a9282eaa46d902a77e738862fb7029964ae2dc9d06e Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.405764 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:01:42 crc kubenswrapper[4788]: W0219 09:01:42.446205 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ac74889_bac3_455a_83aa_b668300ad25d.slice/crio-0518abee2cdd92ed8369e88f4931394f0898a622c45e4fea0e8d768665d23a5c WatchSource:0}: Error finding container 0518abee2cdd92ed8369e88f4931394f0898a622c45e4fea0e8d768665d23a5c: Status 404 returned error can't find the container with id 0518abee2cdd92ed8369e88f4931394f0898a622c45e4fea0e8d768665d23a5c Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.562001 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.612303 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.641894 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.657504 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.905347 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-vprwn" event={"ID":"2d236290-fb64-4f36-9c80-1b7c9d74cca4","Type":"ContainerStarted","Data":"d4da65a43c21e2c1ef2ae88da20296925ef9afa26feac2315d52eecff708f338"} Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.908734 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d","Type":"ContainerStarted","Data":"8669e5c262d13b089341d2cc0ac2ab40a499efccfe24d55360a82795a371152c"} Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.911380 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wllnd" event={"ID":"145312e4-8a69-4c17-964b-2183e2ff66b4","Type":"ContainerStarted","Data":"a3903643c6a0b9b17661e85d4080cef7b7bd5b0e0497adfc6a315343ea5f2844"} Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.930962 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ps9px" event={"ID":"528e3c62-47f6-4cf1-8b32-06dd6657c9f6","Type":"ContainerStarted","Data":"dd13df32eb25b537aa018234d33c9f9669b9d9298b92023335532fb4b817d8b8"} Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.936807 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2dgzz" event={"ID":"c7fed6a4-4d87-463f-84d2-942c28422b8b","Type":"ContainerStarted","Data":"4e677ffb6919250ddcb8b49a52adeb01dc87ae16c3ce6763a3cb74bfa3a1136f"} Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.951775 4788 generic.go:334] "Generic (PLEG): container finished" podID="1d1f649e-07c2-498d-8a63-4e7192c84af1" containerID="98a19927eaddf0c0bbc54011b39b0626fab2e12aaeb247343c51c026524db5ce" exitCode=0 Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.951879 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" 
event={"ID":"1d1f649e-07c2-498d-8a63-4e7192c84af1","Type":"ContainerDied","Data":"98a19927eaddf0c0bbc54011b39b0626fab2e12aaeb247343c51c026524db5ce"} Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.952813 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vprwn" podStartSLOduration=2.952799346 podStartE2EDuration="2.952799346s" podCreationTimestamp="2026-02-19 09:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:01:42.926581457 +0000 UTC m=+1004.914592929" watchObservedRunningTime="2026-02-19 09:01:42.952799346 +0000 UTC m=+1004.940810818" Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.958752 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-ps9px" podStartSLOduration=2.958723048 podStartE2EDuration="2.958723048s" podCreationTimestamp="2026-02-19 09:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:01:42.949778624 +0000 UTC m=+1004.937790096" watchObservedRunningTime="2026-02-19 09:01:42.958723048 +0000 UTC m=+1004.946734520" Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.970494 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ac74889-bac3-455a-83aa-b668300ad25d","Type":"ContainerStarted","Data":"0518abee2cdd92ed8369e88f4931394f0898a622c45e4fea0e8d768665d23a5c"} Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.979276 4788 generic.go:334] "Generic (PLEG): container finished" podID="7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" containerID="0627956e9217fd75ce8ee5fcbbf83d62f58212a44d51398d4d58253e76aefed6" exitCode=0 Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.979332 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" event={"ID":"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1","Type":"ContainerDied","Data":"0627956e9217fd75ce8ee5fcbbf83d62f58212a44d51398d4d58253e76aefed6"} Feb 19 09:01:42 crc kubenswrapper[4788]: I0219 09:01:42.979375 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" event={"ID":"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1","Type":"ContainerStarted","Data":"a8c4331829dd93bb91821a9282eaa46d902a77e738862fb7029964ae2dc9d06e"} Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.502983 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.560492 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9h5s\" (UniqueName: \"kubernetes.io/projected/1d1f649e-07c2-498d-8a63-4e7192c84af1-kube-api-access-r9h5s\") pod \"1d1f649e-07c2-498d-8a63-4e7192c84af1\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.560556 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-swift-storage-0\") pod \"1d1f649e-07c2-498d-8a63-4e7192c84af1\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.560623 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-sb\") pod \"1d1f649e-07c2-498d-8a63-4e7192c84af1\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.560747 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-config\") pod \"1d1f649e-07c2-498d-8a63-4e7192c84af1\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.560777 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-nb\") pod \"1d1f649e-07c2-498d-8a63-4e7192c84af1\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.560802 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-svc\") pod \"1d1f649e-07c2-498d-8a63-4e7192c84af1\" (UID: \"1d1f649e-07c2-498d-8a63-4e7192c84af1\") " Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.569324 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d1f649e-07c2-498d-8a63-4e7192c84af1-kube-api-access-r9h5s" (OuterVolumeSpecName: "kube-api-access-r9h5s") pod "1d1f649e-07c2-498d-8a63-4e7192c84af1" (UID: "1d1f649e-07c2-498d-8a63-4e7192c84af1"). InnerVolumeSpecName "kube-api-access-r9h5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.596062 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1d1f649e-07c2-498d-8a63-4e7192c84af1" (UID: "1d1f649e-07c2-498d-8a63-4e7192c84af1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.596720 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1d1f649e-07c2-498d-8a63-4e7192c84af1" (UID: "1d1f649e-07c2-498d-8a63-4e7192c84af1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.596820 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1d1f649e-07c2-498d-8a63-4e7192c84af1" (UID: "1d1f649e-07c2-498d-8a63-4e7192c84af1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.611823 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-config" (OuterVolumeSpecName: "config") pod "1d1f649e-07c2-498d-8a63-4e7192c84af1" (UID: "1d1f649e-07c2-498d-8a63-4e7192c84af1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.612354 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1d1f649e-07c2-498d-8a63-4e7192c84af1" (UID: "1d1f649e-07c2-498d-8a63-4e7192c84af1"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.662842 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.662880 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.662889 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.662898 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9h5s\" (UniqueName: \"kubernetes.io/projected/1d1f649e-07c2-498d-8a63-4e7192c84af1-kube-api-access-r9h5s\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.662908 4788 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.662916 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d1f649e-07c2-498d-8a63-4e7192c84af1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.994087 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" event={"ID":"1d1f649e-07c2-498d-8a63-4e7192c84af1","Type":"ContainerDied","Data":"c5d912df9b700f0797ef3bc6756ff88927fa1e0a56f6853d287d549ffaa40fdf"} Feb 19 09:01:43 crc 
kubenswrapper[4788]: I0219 09:01:43.994147 4788 scope.go:117] "RemoveContainer" containerID="98a19927eaddf0c0bbc54011b39b0626fab2e12aaeb247343c51c026524db5ce" Feb 19 09:01:43 crc kubenswrapper[4788]: I0219 09:01:43.994103 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-vt8wd" Feb 19 09:01:44 crc kubenswrapper[4788]: I0219 09:01:44.000597 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" event={"ID":"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1","Type":"ContainerStarted","Data":"3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873"} Feb 19 09:01:44 crc kubenswrapper[4788]: I0219 09:01:44.001014 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:44 crc kubenswrapper[4788]: I0219 09:01:44.031538 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" podStartSLOduration=4.031522061 podStartE2EDuration="4.031522061s" podCreationTimestamp="2026-02-19 09:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:01:44.02688849 +0000 UTC m=+1006.014899982" watchObservedRunningTime="2026-02-19 09:01:44.031522061 +0000 UTC m=+1006.019533533" Feb 19 09:01:44 crc kubenswrapper[4788]: I0219 09:01:44.129051 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-vt8wd"] Feb 19 09:01:44 crc kubenswrapper[4788]: I0219 09:01:44.135897 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-vt8wd"] Feb 19 09:01:44 crc kubenswrapper[4788]: E0219 09:01:44.415069 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d1f649e_07c2_498d_8a63_4e7192c84af1.slice/crio-c5d912df9b700f0797ef3bc6756ff88927fa1e0a56f6853d287d549ffaa40fdf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d1f649e_07c2_498d_8a63_4e7192c84af1.slice\": RecentStats: unable to find data in memory cache]" Feb 19 09:01:44 crc kubenswrapper[4788]: I0219 09:01:44.735620 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d1f649e-07c2-498d-8a63-4e7192c84af1" path="/var/lib/kubelet/pods/1d1f649e-07c2-498d-8a63-4e7192c84af1/volumes" Feb 19 09:01:45 crc kubenswrapper[4788]: I0219 09:01:45.028823 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d","Type":"ContainerStarted","Data":"1d67a535c5510dba392680700aade293a70034a4c1c2e21abf4602d7e4db9b8f"} Feb 19 09:01:45 crc kubenswrapper[4788]: I0219 09:01:45.037874 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ac74889-bac3-455a-83aa-b668300ad25d","Type":"ContainerStarted","Data":"bf1707176075de2cfacb78f4ada18a897a4c4ee99dd832e1049d697924e53b32"} Feb 19 09:01:46 crc kubenswrapper[4788]: I0219 09:01:46.051657 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ac74889-bac3-455a-83aa-b668300ad25d","Type":"ContainerStarted","Data":"ac24178c4f372ac66cb0a1b903863cbf834b5734890c5ecdce5f854d2b6fd1bf"} Feb 19 09:01:46 crc kubenswrapper[4788]: I0219 09:01:46.051996 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ac74889-bac3-455a-83aa-b668300ad25d" containerName="glance-log" containerID="cri-o://bf1707176075de2cfacb78f4ada18a897a4c4ee99dd832e1049d697924e53b32" gracePeriod=30 Feb 19 09:01:46 crc kubenswrapper[4788]: 
I0219 09:01:46.052322 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ac74889-bac3-455a-83aa-b668300ad25d" containerName="glance-httpd" containerID="cri-o://ac24178c4f372ac66cb0a1b903863cbf834b5734890c5ecdce5f854d2b6fd1bf" gracePeriod=30 Feb 19 09:01:46 crc kubenswrapper[4788]: I0219 09:01:46.056386 4788 generic.go:334] "Generic (PLEG): container finished" podID="2d236290-fb64-4f36-9c80-1b7c9d74cca4" containerID="d4da65a43c21e2c1ef2ae88da20296925ef9afa26feac2315d52eecff708f338" exitCode=0 Feb 19 09:01:46 crc kubenswrapper[4788]: I0219 09:01:46.056467 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vprwn" event={"ID":"2d236290-fb64-4f36-9c80-1b7c9d74cca4","Type":"ContainerDied","Data":"d4da65a43c21e2c1ef2ae88da20296925ef9afa26feac2315d52eecff708f338"} Feb 19 09:01:46 crc kubenswrapper[4788]: I0219 09:01:46.065864 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d","Type":"ContainerStarted","Data":"a2adc2c592da7c03d4993047af44ec91a35d00e84050304376edbf2c0f363a49"} Feb 19 09:01:46 crc kubenswrapper[4788]: I0219 09:01:46.066019 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" containerName="glance-log" containerID="cri-o://1d67a535c5510dba392680700aade293a70034a4c1c2e21abf4602d7e4db9b8f" gracePeriod=30 Feb 19 09:01:46 crc kubenswrapper[4788]: I0219 09:01:46.066340 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" containerName="glance-httpd" containerID="cri-o://a2adc2c592da7c03d4993047af44ec91a35d00e84050304376edbf2c0f363a49" gracePeriod=30 Feb 19 09:01:46 crc kubenswrapper[4788]: I0219 09:01:46.092612 4788 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.092589097 podStartE2EDuration="6.092589097s" podCreationTimestamp="2026-02-19 09:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:01:46.070406404 +0000 UTC m=+1008.058417876" watchObservedRunningTime="2026-02-19 09:01:46.092589097 +0000 UTC m=+1008.080600569" Feb 19 09:01:46 crc kubenswrapper[4788]: I0219 09:01:46.106413 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.106386478 podStartE2EDuration="6.106386478s" podCreationTimestamp="2026-02-19 09:01:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:01:46.102147966 +0000 UTC m=+1008.090159438" watchObservedRunningTime="2026-02-19 09:01:46.106386478 +0000 UTC m=+1008.094397950" Feb 19 09:01:47 crc kubenswrapper[4788]: I0219 09:01:47.202514 4788 generic.go:334] "Generic (PLEG): container finished" podID="6ac74889-bac3-455a-83aa-b668300ad25d" containerID="ac24178c4f372ac66cb0a1b903863cbf834b5734890c5ecdce5f854d2b6fd1bf" exitCode=0 Feb 19 09:01:47 crc kubenswrapper[4788]: I0219 09:01:47.202760 4788 generic.go:334] "Generic (PLEG): container finished" podID="6ac74889-bac3-455a-83aa-b668300ad25d" containerID="bf1707176075de2cfacb78f4ada18a897a4c4ee99dd832e1049d697924e53b32" exitCode=143 Feb 19 09:01:47 crc kubenswrapper[4788]: I0219 09:01:47.202825 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ac74889-bac3-455a-83aa-b668300ad25d","Type":"ContainerDied","Data":"ac24178c4f372ac66cb0a1b903863cbf834b5734890c5ecdce5f854d2b6fd1bf"} Feb 19 09:01:47 crc kubenswrapper[4788]: I0219 09:01:47.202851 4788 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ac74889-bac3-455a-83aa-b668300ad25d","Type":"ContainerDied","Data":"bf1707176075de2cfacb78f4ada18a897a4c4ee99dd832e1049d697924e53b32"} Feb 19 09:01:47 crc kubenswrapper[4788]: I0219 09:01:47.207404 4788 generic.go:334] "Generic (PLEG): container finished" podID="9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" containerID="a2adc2c592da7c03d4993047af44ec91a35d00e84050304376edbf2c0f363a49" exitCode=0 Feb 19 09:01:47 crc kubenswrapper[4788]: I0219 09:01:47.207430 4788 generic.go:334] "Generic (PLEG): container finished" podID="9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" containerID="1d67a535c5510dba392680700aade293a70034a4c1c2e21abf4602d7e4db9b8f" exitCode=143 Feb 19 09:01:47 crc kubenswrapper[4788]: I0219 09:01:47.207600 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d","Type":"ContainerDied","Data":"a2adc2c592da7c03d4993047af44ec91a35d00e84050304376edbf2c0f363a49"} Feb 19 09:01:47 crc kubenswrapper[4788]: I0219 09:01:47.207625 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d","Type":"ContainerDied","Data":"1d67a535c5510dba392680700aade293a70034a4c1c2e21abf4602d7e4db9b8f"} Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.359655 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.405569 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rq8l\" (UniqueName: \"kubernetes.io/projected/2d236290-fb64-4f36-9c80-1b7c9d74cca4-kube-api-access-6rq8l\") pod \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.405814 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-fernet-keys\") pod \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.405916 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-config-data\") pod \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.405983 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-combined-ca-bundle\") pod \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.406018 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-credential-keys\") pod \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.406057 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-scripts\") pod \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\" (UID: \"2d236290-fb64-4f36-9c80-1b7c9d74cca4\") " Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.415197 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2d236290-fb64-4f36-9c80-1b7c9d74cca4" (UID: "2d236290-fb64-4f36-9c80-1b7c9d74cca4"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.415915 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-scripts" (OuterVolumeSpecName: "scripts") pod "2d236290-fb64-4f36-9c80-1b7c9d74cca4" (UID: "2d236290-fb64-4f36-9c80-1b7c9d74cca4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.417240 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d236290-fb64-4f36-9c80-1b7c9d74cca4-kube-api-access-6rq8l" (OuterVolumeSpecName: "kube-api-access-6rq8l") pod "2d236290-fb64-4f36-9c80-1b7c9d74cca4" (UID: "2d236290-fb64-4f36-9c80-1b7c9d74cca4"). InnerVolumeSpecName "kube-api-access-6rq8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.418680 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2d236290-fb64-4f36-9c80-1b7c9d74cca4" (UID: "2d236290-fb64-4f36-9c80-1b7c9d74cca4"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.436210 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-config-data" (OuterVolumeSpecName: "config-data") pod "2d236290-fb64-4f36-9c80-1b7c9d74cca4" (UID: "2d236290-fb64-4f36-9c80-1b7c9d74cca4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.447436 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d236290-fb64-4f36-9c80-1b7c9d74cca4" (UID: "2d236290-fb64-4f36-9c80-1b7c9d74cca4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.508024 4788 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.508063 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.508076 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.508116 4788 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 
09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.508130 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d236290-fb64-4f36-9c80-1b7c9d74cca4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:50 crc kubenswrapper[4788]: I0219 09:01:50.508141 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rq8l\" (UniqueName: \"kubernetes.io/projected/2d236290-fb64-4f36-9c80-1b7c9d74cca4-kube-api-access-6rq8l\") on node \"crc\" DevicePath \"\"" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.146392 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.207955 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-jdnlc"] Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.208329 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" podUID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerName="dnsmasq-dns" containerID="cri-o://18867f1c93616ceb740fa72d23c4d9b44f08f9f4c708b15efaf4a883fef242a7" gracePeriod=10 Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.255351 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vprwn" event={"ID":"2d236290-fb64-4f36-9c80-1b7c9d74cca4","Type":"ContainerDied","Data":"ae90054f68659cf839baf08db8b9f8430fe7d745b94ca799c61a4f766bebd0d9"} Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.255647 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae90054f68659cf839baf08db8b9f8430fe7d745b94ca799c61a4f766bebd0d9" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.255534 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vprwn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.501780 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vprwn"] Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.508701 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vprwn"] Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.591763 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-p5cvn"] Feb 19 09:01:51 crc kubenswrapper[4788]: E0219 09:01:51.592180 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1f649e-07c2-498d-8a63-4e7192c84af1" containerName="init" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.592203 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1f649e-07c2-498d-8a63-4e7192c84af1" containerName="init" Feb 19 09:01:51 crc kubenswrapper[4788]: E0219 09:01:51.592224 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d236290-fb64-4f36-9c80-1b7c9d74cca4" containerName="keystone-bootstrap" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.592233 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d236290-fb64-4f36-9c80-1b7c9d74cca4" containerName="keystone-bootstrap" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.592515 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d236290-fb64-4f36-9c80-1b7c9d74cca4" containerName="keystone-bootstrap" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.592550 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d1f649e-07c2-498d-8a63-4e7192c84af1" containerName="init" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.593104 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.599285 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.599502 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rzps4" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.599662 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.599702 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.599870 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.610354 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p5cvn"] Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.728073 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-config-data\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.728154 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-combined-ca-bundle\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.728216 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-fernet-keys\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.728267 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcdq8\" (UniqueName: \"kubernetes.io/projected/1bb605f6-3946-4a0b-b492-6b011811ec43-kube-api-access-vcdq8\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.728293 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-credential-keys\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.728358 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-scripts\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.830495 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcdq8\" (UniqueName: \"kubernetes.io/projected/1bb605f6-3946-4a0b-b492-6b011811ec43-kube-api-access-vcdq8\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.830573 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-credential-keys\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.830745 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-scripts\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.830872 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-config-data\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.831704 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-combined-ca-bundle\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.831782 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-fernet-keys\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.840133 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-combined-ca-bundle\") pod \"keystone-bootstrap-p5cvn\" (UID: 
\"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.840273 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-config-data\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.840295 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-credential-keys\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.844761 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-scripts\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.851583 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcdq8\" (UniqueName: \"kubernetes.io/projected/1bb605f6-3946-4a0b-b492-6b011811ec43-kube-api-access-vcdq8\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.853064 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-fernet-keys\") pod \"keystone-bootstrap-p5cvn\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:51 crc kubenswrapper[4788]: I0219 09:01:51.921112 4788 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:01:52 crc kubenswrapper[4788]: I0219 09:01:52.264525 4788 generic.go:334] "Generic (PLEG): container finished" podID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerID="18867f1c93616ceb740fa72d23c4d9b44f08f9f4c708b15efaf4a883fef242a7" exitCode=0 Feb 19 09:01:52 crc kubenswrapper[4788]: I0219 09:01:52.264574 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" event={"ID":"fe6d6323-f486-46f4-86c4-2e69ad36b0ad","Type":"ContainerDied","Data":"18867f1c93616ceb740fa72d23c4d9b44f08f9f4c708b15efaf4a883fef242a7"} Feb 19 09:01:52 crc kubenswrapper[4788]: I0219 09:01:52.725526 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d236290-fb64-4f36-9c80-1b7c9d74cca4" path="/var/lib/kubelet/pods/2d236290-fb64-4f36-9c80-1b7c9d74cca4/volumes" Feb 19 09:01:53 crc kubenswrapper[4788]: I0219 09:01:53.229882 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" podUID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Feb 19 09:01:58 crc kubenswrapper[4788]: I0219 09:01:58.229854 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" podUID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Feb 19 09:01:58 crc kubenswrapper[4788]: E0219 09:01:58.742161 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Feb 19 09:01:58 crc kubenswrapper[4788]: E0219 09:01:58.742365 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb7h67bh5d6h697h598h655h569h547h645h67h669h56fhf5h66h5b4h65chdfh589hc5h667h655hbch584h97h69h94h586h99h4hb6h675h564q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r69hc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(604aa3ed-40d6-437a-93f3-0e7a445b862b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 09:02:01 crc kubenswrapper[4788]: E0219 09:02:01.471172 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 19 09:02:01 crc kubenswrapper[4788]: E0219 09:02:01.471616 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhg86,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-wllnd_openstack(145312e4-8a69-4c17-964b-2183e2ff66b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 09:02:01 crc kubenswrapper[4788]: E0219 09:02:01.472768 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-wllnd" 
podUID="145312e4-8a69-4c17-964b-2183e2ff66b4" Feb 19 09:02:02 crc kubenswrapper[4788]: E0219 09:02:02.366572 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-wllnd" podUID="145312e4-8a69-4c17-964b-2183e2ff66b4" Feb 19 09:02:03 crc kubenswrapper[4788]: I0219 09:02:03.229849 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" podUID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Feb 19 09:02:03 crc kubenswrapper[4788]: I0219 09:02:03.230091 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" Feb 19 09:02:03 crc kubenswrapper[4788]: I0219 09:02:03.377385 4788 generic.go:334] "Generic (PLEG): container finished" podID="528e3c62-47f6-4cf1-8b32-06dd6657c9f6" containerID="dd13df32eb25b537aa018234d33c9f9669b9d9298b92023335532fb4b817d8b8" exitCode=0 Feb 19 09:02:03 crc kubenswrapper[4788]: I0219 09:02:03.377512 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ps9px" event={"ID":"528e3c62-47f6-4cf1-8b32-06dd6657c9f6","Type":"ContainerDied","Data":"dd13df32eb25b537aa018234d33c9f9669b9d9298b92023335532fb4b817d8b8"} Feb 19 09:02:04 crc kubenswrapper[4788]: E0219 09:02:04.429010 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Feb 19 09:02:04 crc kubenswrapper[4788]: E0219 09:02:04.429177 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4mhwm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]Vol
umeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-2dgzz_openstack(c7fed6a4-4d87-463f-84d2-942c28422b8b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 09:02:04 crc kubenswrapper[4788]: E0219 09:02:04.430338 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-2dgzz" podUID="c7fed6a4-4d87-463f-84d2-942c28422b8b" Feb 19 09:02:05 crc kubenswrapper[4788]: E0219 09:02:05.401262 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-2dgzz" podUID="c7fed6a4-4d87-463f-84d2-942c28422b8b" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.221966 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.229959 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.235839 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-ps9px" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.366790 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-public-tls-certs\") pod \"6ac74889-bac3-455a-83aa-b668300ad25d\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.366841 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-httpd-run\") pod \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.366912 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-combined-ca-bundle\") pod \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.366965 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-internal-tls-certs\") pod \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.366989 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-combined-ca-bundle\") pod \"6ac74889-bac3-455a-83aa-b668300ad25d\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367014 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-config\") pod \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367051 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-httpd-run\") pod \"6ac74889-bac3-455a-83aa-b668300ad25d\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367076 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-config-data\") pod \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367095 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367135 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlvln\" (UniqueName: \"kubernetes.io/projected/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-kube-api-access-xlvln\") pod \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367156 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-config-data\") pod \"6ac74889-bac3-455a-83aa-b668300ad25d\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367175 4788 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"6ac74889-bac3-455a-83aa-b668300ad25d\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367209 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-scripts\") pod \"6ac74889-bac3-455a-83aa-b668300ad25d\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367234 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrqvk\" (UniqueName: \"kubernetes.io/projected/6ac74889-bac3-455a-83aa-b668300ad25d-kube-api-access-lrqvk\") pod \"6ac74889-bac3-455a-83aa-b668300ad25d\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367295 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-logs\") pod \"6ac74889-bac3-455a-83aa-b668300ad25d\" (UID: \"6ac74889-bac3-455a-83aa-b668300ad25d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367322 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-scripts\") pod \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367348 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-combined-ca-bundle\") pod \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " Feb 19 09:02:09 crc kubenswrapper[4788]: 
I0219 09:02:09.367382 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-logs\") pod \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\" (UID: \"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367413 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjf97\" (UniqueName: \"kubernetes.io/projected/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-kube-api-access-vjf97\") pod \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\" (UID: \"528e3c62-47f6-4cf1-8b32-06dd6657c9f6\") " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.367831 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6ac74889-bac3-455a-83aa-b668300ad25d" (UID: "6ac74889-bac3-455a-83aa-b668300ad25d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.368222 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" (UID: "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.368502 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-logs" (OuterVolumeSpecName: "logs") pod "6ac74889-bac3-455a-83aa-b668300ad25d" (UID: "6ac74889-bac3-455a-83aa-b668300ad25d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.370158 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-logs" (OuterVolumeSpecName: "logs") pod "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" (UID: "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.443643 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ac74889-bac3-455a-83aa-b668300ad25d","Type":"ContainerDied","Data":"0518abee2cdd92ed8369e88f4931394f0898a622c45e4fea0e8d768665d23a5c"} Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.444037 4788 scope.go:117] "RemoveContainer" containerID="ac24178c4f372ac66cb0a1b903863cbf834b5734890c5ecdce5f854d2b6fd1bf" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.444187 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.462428 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d","Type":"ContainerDied","Data":"8669e5c262d13b089341d2cc0ac2ab40a499efccfe24d55360a82795a371152c"} Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.462505 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.696990 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac74889-bac3-455a-83aa-b668300ad25d-kube-api-access-lrqvk" (OuterVolumeSpecName: "kube-api-access-lrqvk") pod "6ac74889-bac3-455a-83aa-b668300ad25d" (UID: "6ac74889-bac3-455a-83aa-b668300ad25d"). InnerVolumeSpecName "kube-api-access-lrqvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.697087 4788 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.697117 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ac74889-bac3-455a-83aa-b668300ad25d-logs\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.697129 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-logs\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.697187 4788 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.697323 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-kube-api-access-xlvln" (OuterVolumeSpecName: "kube-api-access-xlvln") pod "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" (UID: "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d"). InnerVolumeSpecName "kube-api-access-xlvln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.697395 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ps9px" event={"ID":"528e3c62-47f6-4cf1-8b32-06dd6657c9f6","Type":"ContainerDied","Data":"a807277345138cf730d5feb3906024ffe536e19ffb481911a0415cca1c29a95e"} Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.697448 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a807277345138cf730d5feb3906024ffe536e19ffb481911a0415cca1c29a95e" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.697665 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ps9px" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.699761 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-scripts" (OuterVolumeSpecName: "scripts") pod "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" (UID: "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.703177 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "6ac74889-bac3-455a-83aa-b668300ad25d" (UID: "6ac74889-bac3-455a-83aa-b668300ad25d"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.707725 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-kube-api-access-vjf97" (OuterVolumeSpecName: "kube-api-access-vjf97") pod "528e3c62-47f6-4cf1-8b32-06dd6657c9f6" (UID: "528e3c62-47f6-4cf1-8b32-06dd6657c9f6"). InnerVolumeSpecName "kube-api-access-vjf97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.710365 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-scripts" (OuterVolumeSpecName: "scripts") pod "6ac74889-bac3-455a-83aa-b668300ad25d" (UID: "6ac74889-bac3-455a-83aa-b668300ad25d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.735347 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" (UID: "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.742921 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" (UID: "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.742952 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "528e3c62-47f6-4cf1-8b32-06dd6657c9f6" (UID: "528e3c62-47f6-4cf1-8b32-06dd6657c9f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.743170 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-config" (OuterVolumeSpecName: "config") pod "528e3c62-47f6-4cf1-8b32-06dd6657c9f6" (UID: "528e3c62-47f6-4cf1-8b32-06dd6657c9f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.743465 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ac74889-bac3-455a-83aa-b668300ad25d" (UID: "6ac74889-bac3-455a-83aa-b668300ad25d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.746651 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-config-data" (OuterVolumeSpecName: "config-data") pod "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" (UID: "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.747737 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-config-data" (OuterVolumeSpecName: "config-data") pod "6ac74889-bac3-455a-83aa-b668300ad25d" (UID: "6ac74889-bac3-455a-83aa-b668300ad25d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.749553 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6ac74889-bac3-455a-83aa-b668300ad25d" (UID: "6ac74889-bac3-455a-83aa-b668300ad25d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.763171 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" (UID: "9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.798926 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.798995 4788 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799013 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799030 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-config\") on node \"crc\" 
DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799050 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799082 4788 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799100 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799118 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlvln\" (UniqueName: \"kubernetes.io/projected/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-kube-api-access-xlvln\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799144 4788 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799159 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799175 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrqvk\" (UniqueName: \"kubernetes.io/projected/6ac74889-bac3-455a-83aa-b668300ad25d-kube-api-access-lrqvk\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799190 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799206 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799222 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjf97\" (UniqueName: \"kubernetes.io/projected/528e3c62-47f6-4cf1-8b32-06dd6657c9f6-kube-api-access-vjf97\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.799238 4788 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ac74889-bac3-455a-83aa-b668300ad25d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.825220 4788 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.827969 4788 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.901738 4788 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:09 crc kubenswrapper[4788]: I0219 09:02:09.901774 4788 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.074628 4788 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.085174 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.101355 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:02:10 crc kubenswrapper[4788]: E0219 09:02:10.101914 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528e3c62-47f6-4cf1-8b32-06dd6657c9f6" containerName="neutron-db-sync" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.101986 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="528e3c62-47f6-4cf1-8b32-06dd6657c9f6" containerName="neutron-db-sync" Feb 19 09:02:10 crc kubenswrapper[4788]: E0219 09:02:10.102041 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" containerName="glance-log" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.102088 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" containerName="glance-log" Feb 19 09:02:10 crc kubenswrapper[4788]: E0219 09:02:10.102138 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac74889-bac3-455a-83aa-b668300ad25d" containerName="glance-log" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.102186 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac74889-bac3-455a-83aa-b668300ad25d" containerName="glance-log" Feb 19 09:02:10 crc kubenswrapper[4788]: E0219 09:02:10.102253 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac74889-bac3-455a-83aa-b668300ad25d" containerName="glance-httpd" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.102310 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac74889-bac3-455a-83aa-b668300ad25d" containerName="glance-httpd" Feb 19 09:02:10 crc kubenswrapper[4788]: E0219 
09:02:10.102384 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" containerName="glance-httpd" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.102434 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" containerName="glance-httpd" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.102658 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac74889-bac3-455a-83aa-b668300ad25d" containerName="glance-log" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.102756 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="528e3c62-47f6-4cf1-8b32-06dd6657c9f6" containerName="neutron-db-sync" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.102819 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac74889-bac3-455a-83aa-b668300ad25d" containerName="glance-httpd" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.102906 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" containerName="glance-log" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.102983 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" containerName="glance-httpd" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.104099 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.107216 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.107574 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9pt5q" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.107801 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.108372 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.111546 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.114831 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.159937 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.161515 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.167328 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.167760 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.170768 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.177838 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.210163 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.210281 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.210306 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-logs\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.210326 
4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-config-data\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.210358 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-scripts\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.210383 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cl2z\" (UniqueName: \"kubernetes.io/projected/13e91365-d18b-4977-9292-91b3f98f8469-kube-api-access-2cl2z\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.210420 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.210447 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 
crc kubenswrapper[4788]: I0219 09:02:10.312264 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312331 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312357 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-logs\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312384 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-config-data\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312416 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312445 4788 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312466 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-scripts\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312510 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cl2z\" (UniqueName: \"kubernetes.io/projected/13e91365-d18b-4977-9292-91b3f98f8469-kube-api-access-2cl2z\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312530 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wpbh\" (UniqueName: \"kubernetes.io/projected/855949a4-e027-44b4-8705-202c74c3ffdb-kube-api-access-7wpbh\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312574 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312606 4788 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312624 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312648 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312687 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312708 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-logs\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312735 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.312918 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.315015 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-logs\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.319463 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-config-data\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.321504 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-scripts\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.323091 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.325299 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.328427 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.337971 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cl2z\" (UniqueName: \"kubernetes.io/projected/13e91365-d18b-4977-9292-91b3f98f8469-kube-api-access-2cl2z\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.346147 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.414523 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.414630 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wpbh\" (UniqueName: \"kubernetes.io/projected/855949a4-e027-44b4-8705-202c74c3ffdb-kube-api-access-7wpbh\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.414708 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.414758 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.414781 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-logs\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.414810 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.414846 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.414889 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.418841 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.422270 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.426303 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 
09:02:10.426602 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.440768 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-logs\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.445043 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.445472 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.458841 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wpbh\" (UniqueName: \"kubernetes.io/projected/855949a4-e027-44b4-8705-202c74c3ffdb-kube-api-access-7wpbh\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.490630 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.553765 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.575637 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tlwft"] Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.576945 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.612994 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tlwft"] Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.722297 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.722349 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.722393 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n2p6\" (UniqueName: 
\"kubernetes.io/projected/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-kube-api-access-9n2p6\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.722422 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.722508 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.722536 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-config\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.748015 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac74889-bac3-455a-83aa-b668300ad25d" path="/var/lib/kubelet/pods/6ac74889-bac3-455a-83aa-b668300ad25d/volumes" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.749118 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d" path="/var/lib/kubelet/pods/9a2a6d54-4f1c-49aa-8ae2-8fa920dacb9d/volumes" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.749870 4788 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c6b98d796-z866s"] Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.752350 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.754633 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.755882 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.756123 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zm5vb" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.756284 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.761435 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c6b98d796-z866s"] Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.803468 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.824453 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n2p6\" (UniqueName: \"kubernetes.io/projected/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-kube-api-access-9n2p6\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.824511 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-combined-ca-bundle\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.824553 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.824622 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grbjv\" (UniqueName: \"kubernetes.io/projected/b4c17192-39ae-4520-aed0-f6325410b6b6-kube-api-access-grbjv\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.824653 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: 
\"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.824672 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-config\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.824708 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-ovndb-tls-certs\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.824728 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-httpd-config\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.824770 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-config\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.824801 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " 
pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.824836 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.825756 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.826736 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-svc\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.827858 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.828499 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-config\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 
09:02:10.829213 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.845601 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n2p6\" (UniqueName: \"kubernetes.io/projected/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-kube-api-access-9n2p6\") pod \"dnsmasq-dns-55f844cf75-tlwft\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") " pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.903426 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.926703 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-ovndb-tls-certs\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.926747 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-httpd-config\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.926803 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-config\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " 
pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.926874 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-combined-ca-bundle\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.926942 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grbjv\" (UniqueName: \"kubernetes.io/projected/b4c17192-39ae-4520-aed0-f6325410b6b6-kube-api-access-grbjv\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.931169 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-ovndb-tls-certs\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.931925 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-combined-ca-bundle\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.932029 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-config\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.932533 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-httpd-config\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:10 crc kubenswrapper[4788]: I0219 09:02:10.943488 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grbjv\" (UniqueName: \"kubernetes.io/projected/b4c17192-39ae-4520-aed0-f6325410b6b6-kube-api-access-grbjv\") pod \"neutron-6c6b98d796-z866s\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:11 crc kubenswrapper[4788]: I0219 09:02:11.082783 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:02:11 crc kubenswrapper[4788]: E0219 09:02:11.215130 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 19 09:02:11 crc kubenswrapper[4788]: E0219 09:02:11.215367 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ctgvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-8lpgx_openstack(c2a3acb8-146c-47c0-9218-81cd2728edf9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 19 09:02:11 crc kubenswrapper[4788]: E0219 09:02:11.217462 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-8lpgx" podUID="c2a3acb8-146c-47c0-9218-81cd2728edf9"
Feb 19 09:02:11 crc kubenswrapper[4788]: I0219 09:02:11.656642 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc"
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.406467 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6klg\" (UniqueName: \"kubernetes.io/projected/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-kube-api-access-q6klg\") pod \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") "
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.406539 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-sb\") pod \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") "
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.406705 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-config\") pod \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") "
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.406771 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-nb\") pod \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") "
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.406879 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-svc\") pod \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") "
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.406920 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-swift-storage-0\") pod \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\" (UID: \"fe6d6323-f486-46f4-86c4-2e69ad36b0ad\") "
Feb 19 09:02:12 crc kubenswrapper[4788]: E0219 09:02:12.446491 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified"
Feb 19 09:02:12 crc kubenswrapper[4788]: E0219 09:02:12.480951 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xn525,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-wf626_openstack(708e9c03-709f-4846-bf0f-abb71c9e164f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.479376 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-kube-api-access-q6klg" (OuterVolumeSpecName: "kube-api-access-q6klg") pod "fe6d6323-f486-46f4-86c4-2e69ad36b0ad" (UID: "fe6d6323-f486-46f4-86c4-2e69ad36b0ad"). InnerVolumeSpecName "kube-api-access-q6klg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:02:12 crc kubenswrapper[4788]: E0219 09:02:12.487062 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-wf626" podUID="708e9c03-709f-4846-bf0f-abb71c9e164f"
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.487600 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc"
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.487759 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" event={"ID":"fe6d6323-f486-46f4-86c4-2e69ad36b0ad","Type":"ContainerDied","Data":"9d2177cef06c0cbadefa8ebba90ff822e6d788a5414dd46335c73b3abe8605fe"}
Feb 19 09:02:12 crc kubenswrapper[4788]: E0219 09:02:12.489222 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-8lpgx" podUID="c2a3acb8-146c-47c0-9218-81cd2728edf9"
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.509971 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6klg\" (UniqueName: \"kubernetes.io/projected/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-kube-api-access-q6klg\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.514651 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fe6d6323-f486-46f4-86c4-2e69ad36b0ad" (UID: "fe6d6323-f486-46f4-86c4-2e69ad36b0ad"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.519405 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe6d6323-f486-46f4-86c4-2e69ad36b0ad" (UID: "fe6d6323-f486-46f4-86c4-2e69ad36b0ad"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.532677 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe6d6323-f486-46f4-86c4-2e69ad36b0ad" (UID: "fe6d6323-f486-46f4-86c4-2e69ad36b0ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.549926 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe6d6323-f486-46f4-86c4-2e69ad36b0ad" (UID: "fe6d6323-f486-46f4-86c4-2e69ad36b0ad"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.563931 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-config" (OuterVolumeSpecName: "config") pod "fe6d6323-f486-46f4-86c4-2e69ad36b0ad" (UID: "fe6d6323-f486-46f4-86c4-2e69ad36b0ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.611956 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.611996 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.612009 4788 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.612020 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.612030 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe6d6323-f486-46f4-86c4-2e69ad36b0ad-config\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.925891 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-jdnlc"]
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.935138 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-jdnlc"]
Feb 19 09:02:12 crc kubenswrapper[4788]: I0219 09:02:12.943234 4788 scope.go:117] "RemoveContainer" containerID="bf1707176075de2cfacb78f4ada18a897a4c4ee99dd832e1049d697924e53b32"
Feb 19 09:02:13 crc kubenswrapper[4788]: I0219 09:02:13.230584 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-jdnlc" podUID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout"
Feb 19 09:02:13 crc kubenswrapper[4788]: I0219 09:02:13.270001 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p5cvn"]
Feb 19 09:02:13 crc kubenswrapper[4788]: I0219 09:02:13.452915 4788 scope.go:117] "RemoveContainer" containerID="a2adc2c592da7c03d4993047af44ec91a35d00e84050304376edbf2c0f363a49"
Feb 19 09:02:13 crc kubenswrapper[4788]: W0219 09:02:13.460473 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bb605f6_3946_4a0b_b492_6b011811ec43.slice/crio-aa2faf74342d1089b20c634716fd63a9f9e338ed33e13a48b376bb1283c04258 WatchSource:0}: Error finding container aa2faf74342d1089b20c634716fd63a9f9e338ed33e13a48b376bb1283c04258: Status 404 returned error can't find the container with id aa2faf74342d1089b20c634716fd63a9f9e338ed33e13a48b376bb1283c04258
Feb 19 09:02:13 crc kubenswrapper[4788]: I0219 09:02:13.512525 4788 scope.go:117] "RemoveContainer" containerID="1d67a535c5510dba392680700aade293a70034a4c1c2e21abf4602d7e4db9b8f"
Feb 19 09:02:13 crc kubenswrapper[4788]: I0219 09:02:13.552719 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 19 09:02:13 crc kubenswrapper[4788]: I0219 09:02:13.563413 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p5cvn" event={"ID":"1bb605f6-3946-4a0b-b492-6b011811ec43","Type":"ContainerStarted","Data":"aa2faf74342d1089b20c634716fd63a9f9e338ed33e13a48b376bb1283c04258"}
Feb 19 09:02:14 crc kubenswrapper[4788]: E0219 09:02:14.919513 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-wf626" podUID="708e9c03-709f-4846-bf0f-abb71c9e164f"
Feb 19 09:02:14 crc kubenswrapper[4788]: I0219 09:02:14.953825 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" path="/var/lib/kubelet/pods/fe6d6323-f486-46f4-86c4-2e69ad36b0ad/volumes"
Feb 19 09:02:14 crc kubenswrapper[4788]: I0219 09:02:14.958232 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-79f86df99f-fntgk"]
Feb 19 09:02:14 crc kubenswrapper[4788]: E0219 09:02:14.958571 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerName="init"
Feb 19 09:02:14 crc kubenswrapper[4788]: I0219 09:02:14.958583 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerName="init"
Feb 19 09:02:14 crc kubenswrapper[4788]: E0219 09:02:14.958607 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerName="dnsmasq-dns"
Feb 19 09:02:14 crc kubenswrapper[4788]: I0219 09:02:14.958614 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerName="dnsmasq-dns"
Feb 19 09:02:14 crc kubenswrapper[4788]: I0219 09:02:14.958784 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6d6323-f486-46f4-86c4-2e69ad36b0ad" containerName="dnsmasq-dns"
Feb 19 09:02:14 crc kubenswrapper[4788]: I0219 09:02:14.962229 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:14 crc kubenswrapper[4788]: I0219 09:02:14.968464 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79f86df99f-fntgk"]
Feb 19 09:02:14 crc kubenswrapper[4788]: I0219 09:02:14.968901 4788 scope.go:117] "RemoveContainer" containerID="18867f1c93616ceb740fa72d23c4d9b44f08f9f4c708b15efaf4a883fef242a7"
Feb 19 09:02:14 crc kubenswrapper[4788]: I0219 09:02:14.969132 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 19 09:02:14 crc kubenswrapper[4788]: I0219 09:02:14.969184 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.138219 4788 scope.go:117] "RemoveContainer" containerID="a1eb30227ff2aaa4f06e26503102b3e993fce23a450bb6311df3abbea081c9d9"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.139682 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-httpd-config\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.139732 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-internal-tls-certs\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.139756 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-config\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.139813 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-combined-ca-bundle\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.139838 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-ovndb-tls-certs\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.139879 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-public-tls-certs\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.139904 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5zrk\" (UniqueName: \"kubernetes.io/projected/72d8f24f-5e9f-480b-8f17-178d59ebc51d-kube-api-access-q5zrk\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.241741 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-public-tls-certs\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.241814 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5zrk\" (UniqueName: \"kubernetes.io/projected/72d8f24f-5e9f-480b-8f17-178d59ebc51d-kube-api-access-q5zrk\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.241877 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-httpd-config\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.241923 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-internal-tls-certs\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.241956 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-config\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.242095 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-combined-ca-bundle\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.242129 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-ovndb-tls-certs\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.248794 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-public-tls-certs\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.251925 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-combined-ca-bundle\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.253889 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-ovndb-tls-certs\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.256444 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-config\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.262836 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-internal-tls-certs\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.263996 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-httpd-config\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.266683 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5zrk\" (UniqueName: \"kubernetes.io/projected/72d8f24f-5e9f-480b-8f17-178d59ebc51d-kube-api-access-q5zrk\") pod \"neutron-79f86df99f-fntgk\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.296486 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.500539 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.557562 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tlwft"]
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.598263 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.912980 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c6b98d796-z866s"]
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.967750 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tlwft" event={"ID":"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5","Type":"ContainerStarted","Data":"5a902fa7009d3b217fa0e5e1e4a731510888bcabb4f31d46ca30d712bfa51526"}
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.969540 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"855949a4-e027-44b4-8705-202c74c3ffdb","Type":"ContainerStarted","Data":"16f2e88691f0e23b8e6b98c9f4ca5a54b0afac0165ff2c3be7ecec7b737c8e06"}
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.971687 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13e91365-d18b-4977-9292-91b3f98f8469","Type":"ContainerStarted","Data":"8b02d85966c2a35277ddfe31c0eca403fd8d92c188bf71748b66a40913185ef5"}
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.975067 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b98d796-z866s" event={"ID":"b4c17192-39ae-4520-aed0-f6325410b6b6","Type":"ContainerStarted","Data":"60ae340d34eeab7c9131ac2a059c463b12af83909f747d143ee86e7f0f5d835d"}
Feb 19 09:02:15 crc kubenswrapper[4788]: I0219 09:02:15.991914 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-79f86df99f-fntgk"]
Feb 19 09:02:16 crc kubenswrapper[4788]: I0219 09:02:16.984958 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f86df99f-fntgk" event={"ID":"72d8f24f-5e9f-480b-8f17-178d59ebc51d","Type":"ContainerStarted","Data":"bba8c2f89f3776ac73fd386e96efe04fe37b1ab85a377fd382206020bb86797f"}
Feb 19 09:02:21 crc kubenswrapper[4788]: I0219 09:02:21.585873 4788 generic.go:334] "Generic (PLEG): container finished" podID="6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" containerID="b4f34c29986d5b69dea0638c001f1f96d6e7e3fcb4c32c043eae0572802c4112" exitCode=0
Feb 19 09:02:21 crc kubenswrapper[4788]: I0219 09:02:21.586038 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tlwft" event={"ID":"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5","Type":"ContainerDied","Data":"b4f34c29986d5b69dea0638c001f1f96d6e7e3fcb4c32c043eae0572802c4112"}
Feb 19 09:02:21 crc kubenswrapper[4788]: I0219 09:02:21.593814 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"855949a4-e027-44b4-8705-202c74c3ffdb","Type":"ContainerStarted","Data":"6946745df3ff4febdf38cfda55b437589b22c84ad790a5b24c86aa802cad8ddf"}
Feb 19 09:02:21 crc kubenswrapper[4788]: I0219 09:02:21.596744 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f86df99f-fntgk" event={"ID":"72d8f24f-5e9f-480b-8f17-178d59ebc51d","Type":"ContainerStarted","Data":"5c62216ae4db61c5ade08749fc3bc7572732577d80b0dcf9393ae1dcf8fef31c"}
Feb 19 09:02:21 crc kubenswrapper[4788]: I0219 09:02:21.599097 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"604aa3ed-40d6-437a-93f3-0e7a445b862b","Type":"ContainerStarted","Data":"eec4936a279633822f5eec8d3f3413a25656e7bb6fb479f1dbb584f10dd3cb00"}
Feb 19 09:02:21 crc kubenswrapper[4788]: I0219 09:02:21.602270 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p5cvn" event={"ID":"1bb605f6-3946-4a0b-b492-6b011811ec43","Type":"ContainerStarted","Data":"6f37c7d43101773aee7e9873708bce865b6a29baa2a68fbbe0db46bcb69bdbe4"}
Feb 19 09:02:21 crc kubenswrapper[4788]: I0219 09:02:21.604565 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13e91365-d18b-4977-9292-91b3f98f8469","Type":"ContainerStarted","Data":"44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5"}
Feb 19 09:02:21 crc kubenswrapper[4788]: I0219 09:02:21.609983 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b98d796-z866s" event={"ID":"b4c17192-39ae-4520-aed0-f6325410b6b6","Type":"ContainerStarted","Data":"5286348298ae9e5284ff477c31c3d706f166c1dcd16b4e678a3f30e44509763c"}
Feb 19 09:02:21 crc kubenswrapper[4788]: I0219 09:02:21.633203 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-p5cvn" podStartSLOduration=30.633188811 podStartE2EDuration="30.633188811s" podCreationTimestamp="2026-02-19 09:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:21.63065828 +0000 UTC m=+1043.618669762" watchObservedRunningTime="2026-02-19 09:02:21.633188811 +0000 UTC m=+1043.621200283"
Feb 19 09:02:22 crc kubenswrapper[4788]: I0219 09:02:22.628155 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f86df99f-fntgk" event={"ID":"72d8f24f-5e9f-480b-8f17-178d59ebc51d","Type":"ContainerStarted","Data":"8150d321c659059affed0871779c3db7c71e68fe07696885a063990924a72086"}
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.640828 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"855949a4-e027-44b4-8705-202c74c3ffdb","Type":"ContainerStarted","Data":"84ffdaa68b4ec177038b46443924be2446a2c21907fb89afa57c772f8ed8be25"}
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.644676 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13e91365-d18b-4977-9292-91b3f98f8469","Type":"ContainerStarted","Data":"06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37"}
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.647206 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b98d796-z866s" event={"ID":"b4c17192-39ae-4520-aed0-f6325410b6b6","Type":"ContainerStarted","Data":"2644ab20c671b994a5162da9472b9b4f0b23642d7e291858db5645285635078c"}
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.647468 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c6b98d796-z866s"
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.649500 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wllnd" event={"ID":"145312e4-8a69-4c17-964b-2183e2ff66b4","Type":"ContainerStarted","Data":"62e0209a0aafc547777ddc5ebb626e3bf3ed3823d504a8e54df4338e120db24a"}
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.652278 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tlwft" event={"ID":"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5","Type":"ContainerStarted","Data":"b819790621b6d390d42c9c13a55c404d84091c6c5b57d7c2249b0cada1303a2a"}
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.652320 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-79f86df99f-fntgk"
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.652333 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-tlwft"
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.671824 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.671807368 podStartE2EDuration="13.671807368s" podCreationTimestamp="2026-02-19 09:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:23.667069554 +0000 UTC m=+1045.655081026" watchObservedRunningTime="2026-02-19 09:02:23.671807368 +0000 UTC m=+1045.659818840"
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.704407 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.70437393 podStartE2EDuration="13.70437393s" podCreationTimestamp="2026-02-19 09:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:23.692604717 +0000 UTC m=+1045.680616189" watchObservedRunningTime="2026-02-19 09:02:23.70437393 +0000 UTC m=+1045.692385402"
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.719942 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-79f86df99f-fntgk" podStartSLOduration=10.719915033 podStartE2EDuration="10.719915033s" podCreationTimestamp="2026-02-19 09:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:23.713494479 +0000 UTC m=+1045.701505951" watchObservedRunningTime="2026-02-19 09:02:23.719915033 +0000 UTC m=+1045.707926495"
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.768946 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c6b98d796-z866s" podStartSLOduration=13.76891854 podStartE2EDuration="13.76891854s" podCreationTimestamp="2026-02-19 09:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:23.740632691 +0000 UTC m=+1045.728644163" watchObservedRunningTime="2026-02-19 09:02:23.76891854 +0000 UTC m=+1045.756930012"
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.769749 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-tlwft" podStartSLOduration=13.7697445 podStartE2EDuration="13.7697445s" podCreationTimestamp="2026-02-19 09:02:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:23.759105824 +0000 UTC m=+1045.747117306" watchObservedRunningTime="2026-02-19 09:02:23.7697445 +0000 UTC m=+1045.757755972"
Feb 19 09:02:23 crc kubenswrapper[4788]: I0219 09:02:23.775596 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wllnd" podStartSLOduration=5.06874952 podStartE2EDuration="43.7755635s" podCreationTimestamp="2026-02-19 09:01:40 +0000 UTC" firstStartedPulling="2026-02-19 09:01:42.00313115 +0000 UTC m=+1003.991142622" lastFinishedPulling="2026-02-19 09:02:20.70994512 +0000 UTC m=+1042.697956602" observedRunningTime="2026-02-19 09:02:23.773941031 +0000 UTC m=+1045.761952503" watchObservedRunningTime="2026-02-19 09:02:23.7755635 +0000 UTC m=+1045.763574972"
Feb 19 09:02:27 crc kubenswrapper[4788]: I0219 09:02:27.687716 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2dgzz" event={"ID":"c7fed6a4-4d87-463f-84d2-942c28422b8b","Type":"ContainerStarted","Data":"586d43b2216b6b3a2959b5d5b920474f8cbfddf9badec5646f683200069c85d3"}
Feb 19 09:02:27 crc kubenswrapper[4788]: I0219 09:02:27.707829 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2dgzz" podStartSLOduration=4.04350522 podStartE2EDuration="47.707812551s" podCreationTimestamp="2026-02-19 09:01:40 +0000 UTC" firstStartedPulling="2026-02-19 09:01:42.092682171 +0000 UTC m=+1004.080693643" lastFinishedPulling="2026-02-19 09:02:25.756989492 +0000 UTC m=+1047.745000974" observedRunningTime="2026-02-19 09:02:27.703710953 +0000 UTC m=+1049.691722435" watchObservedRunningTime="2026-02-19 09:02:27.707812551 +0000 UTC m=+1049.695824023"
Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.270704 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8lpgx" event={"ID":"c2a3acb8-146c-47c0-9218-81cd2728edf9","Type":"ContainerStarted","Data":"9b492f113d579d3a32fbe3bf60ec81d340c3244557313d4ea483895235b6e6df"}
Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.280013 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wf626" event={"ID":"708e9c03-709f-4846-bf0f-abb71c9e164f","Type":"ContainerStarted","Data":"1d69b08f03a65afdc2102ef07ccfb54e9ca1781bf3905d5c42f892f9453078e1"}
Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.283142 4788 generic.go:334] "Generic (PLEG): container finished" podID="1bb605f6-3946-4a0b-b492-6b011811ec43" containerID="6f37c7d43101773aee7e9873708bce865b6a29baa2a68fbbe0db46bcb69bdbe4" exitCode=0
Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.283192 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p5cvn" event={"ID":"1bb605f6-3946-4a0b-b492-6b011811ec43","Type":"ContainerDied","Data":"6f37c7d43101773aee7e9873708bce865b6a29baa2a68fbbe0db46bcb69bdbe4"}
Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.293761 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-8lpgx" podStartSLOduration=4.170043957 podStartE2EDuration="50.293547756s" podCreationTimestamp="2026-02-19 09:01:40 +0000 UTC" firstStartedPulling="2026-02-19 09:01:41.779682234 +0000 UTC m=+1003.767693706" lastFinishedPulling="2026-02-19 09:02:27.903186033 +0000 UTC m=+1049.891197505" observedRunningTime="2026-02-19
09:02:30.29077332 +0000 UTC m=+1052.278784782" watchObservedRunningTime="2026-02-19 09:02:30.293547756 +0000 UTC m=+1052.281559228" Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.335480 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-wf626" podStartSLOduration=2.114346366 podStartE2EDuration="50.335457763s" podCreationTimestamp="2026-02-19 09:01:40 +0000 UTC" firstStartedPulling="2026-02-19 09:01:41.305769724 +0000 UTC m=+1003.293781186" lastFinishedPulling="2026-02-19 09:02:29.526881111 +0000 UTC m=+1051.514892583" observedRunningTime="2026-02-19 09:02:30.315879862 +0000 UTC m=+1052.303891334" watchObservedRunningTime="2026-02-19 09:02:30.335457763 +0000 UTC m=+1052.323469235" Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.491834 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.491892 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.536184 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.548314 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.805214 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:30 crc kubenswrapper[4788]: I0219 09:02:30.805726 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:31 crc kubenswrapper[4788]: I0219 09:02:31.032419 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Feb 19 09:02:31 crc kubenswrapper[4788]: I0219 09:02:31.032480 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-tlwft" Feb 19 09:02:31 crc kubenswrapper[4788]: I0219 09:02:31.032516 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:31 crc kubenswrapper[4788]: I0219 09:02:31.153683 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-z4jjr"] Feb 19 09:02:31 crc kubenswrapper[4788]: I0219 09:02:31.153921 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" podUID="7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" containerName="dnsmasq-dns" containerID="cri-o://3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873" gracePeriod=10 Feb 19 09:02:31 crc kubenswrapper[4788]: I0219 09:02:31.298944 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:31 crc kubenswrapper[4788]: I0219 09:02:31.300424 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 09:02:31 crc kubenswrapper[4788]: I0219 09:02:31.300588 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:31 crc kubenswrapper[4788]: I0219 09:02:31.300655 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.073986 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.080319 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.109343 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-scripts\") pod \"1bb605f6-3946-4a0b-b492-6b011811ec43\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.109432 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcdq8\" (UniqueName: \"kubernetes.io/projected/1bb605f6-3946-4a0b-b492-6b011811ec43-kube-api-access-vcdq8\") pod \"1bb605f6-3946-4a0b-b492-6b011811ec43\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.109460 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-credential-keys\") pod \"1bb605f6-3946-4a0b-b492-6b011811ec43\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.109594 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-combined-ca-bundle\") pod \"1bb605f6-3946-4a0b-b492-6b011811ec43\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.109627 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-config-data\") pod \"1bb605f6-3946-4a0b-b492-6b011811ec43\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.109653 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-fernet-keys\") pod \"1bb605f6-3946-4a0b-b492-6b011811ec43\" (UID: \"1bb605f6-3946-4a0b-b492-6b011811ec43\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.121072 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1bb605f6-3946-4a0b-b492-6b011811ec43" (UID: "1bb605f6-3946-4a0b-b492-6b011811ec43"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.134987 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb605f6-3946-4a0b-b492-6b011811ec43-kube-api-access-vcdq8" (OuterVolumeSpecName: "kube-api-access-vcdq8") pod "1bb605f6-3946-4a0b-b492-6b011811ec43" (UID: "1bb605f6-3946-4a0b-b492-6b011811ec43"). InnerVolumeSpecName "kube-api-access-vcdq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.137611 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-scripts" (OuterVolumeSpecName: "scripts") pod "1bb605f6-3946-4a0b-b492-6b011811ec43" (UID: "1bb605f6-3946-4a0b-b492-6b011811ec43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.153908 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1bb605f6-3946-4a0b-b492-6b011811ec43" (UID: "1bb605f6-3946-4a0b-b492-6b011811ec43"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.159436 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-config-data" (OuterVolumeSpecName: "config-data") pod "1bb605f6-3946-4a0b-b492-6b011811ec43" (UID: "1bb605f6-3946-4a0b-b492-6b011811ec43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.175369 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bb605f6-3946-4a0b-b492-6b011811ec43" (UID: "1bb605f6-3946-4a0b-b492-6b011811ec43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.210843 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-swift-storage-0\") pod \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.210911 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-nb\") pod \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.210937 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94j7h\" (UniqueName: \"kubernetes.io/projected/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-kube-api-access-94j7h\") pod \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\" (UID: 
\"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.211028 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-config\") pod \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.211049 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-svc\") pod \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.211094 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-sb\") pod \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\" (UID: \"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1\") " Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.211437 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.211451 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcdq8\" (UniqueName: \"kubernetes.io/projected/1bb605f6-3946-4a0b-b492-6b011811ec43-kube-api-access-vcdq8\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.211461 4788 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.211472 4788 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.211479 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.211486 4788 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1bb605f6-3946-4a0b-b492-6b011811ec43-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.240959 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-kube-api-access-94j7h" (OuterVolumeSpecName: "kube-api-access-94j7h") pod "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" (UID: "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1"). InnerVolumeSpecName "kube-api-access-94j7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.267438 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" (UID: "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.267745 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" (UID: "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.269697 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" (UID: "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.273911 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" (UID: "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.280982 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-config" (OuterVolumeSpecName: "config") pod "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" (UID: "7e6a3b3f-85b6-4492-95cb-5834f14fe5b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.310095 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p5cvn" event={"ID":"1bb605f6-3946-4a0b-b492-6b011811ec43","Type":"ContainerDied","Data":"aa2faf74342d1089b20c634716fd63a9f9e338ed33e13a48b376bb1283c04258"} Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.310160 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa2faf74342d1089b20c634716fd63a9f9e338ed33e13a48b376bb1283c04258" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.310177 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-p5cvn" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.312217 4788 generic.go:334] "Generic (PLEG): container finished" podID="7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" containerID="3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873" exitCode=0 Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.312624 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.312650 4788 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.312663 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.312736 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.313139 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" event={"ID":"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1","Type":"ContainerDied","Data":"3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873"} Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.313170 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-z4jjr" event={"ID":"7e6a3b3f-85b6-4492-95cb-5834f14fe5b1","Type":"ContainerDied","Data":"a8c4331829dd93bb91821a9282eaa46d902a77e738862fb7029964ae2dc9d06e"} Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.313190 4788 scope.go:117] "RemoveContainer" containerID="3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.314054 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94j7h\" (UniqueName: \"kubernetes.io/projected/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-kube-api-access-94j7h\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.314581 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.315094 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.421919 4788 scope.go:117] "RemoveContainer" containerID="0627956e9217fd75ce8ee5fcbbf83d62f58212a44d51398d4d58253e76aefed6" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.443233 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-748f6c7c59-q59qb"] 
Feb 19 09:02:32 crc kubenswrapper[4788]: E0219 09:02:32.444087 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" containerName="init" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.444113 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" containerName="init" Feb 19 09:02:32 crc kubenswrapper[4788]: E0219 09:02:32.444122 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb605f6-3946-4a0b-b492-6b011811ec43" containerName="keystone-bootstrap" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.444129 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb605f6-3946-4a0b-b492-6b011811ec43" containerName="keystone-bootstrap" Feb 19 09:02:32 crc kubenswrapper[4788]: E0219 09:02:32.444163 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" containerName="dnsmasq-dns" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.444169 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" containerName="dnsmasq-dns" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.444525 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bb605f6-3946-4a0b-b492-6b011811ec43" containerName="keystone-bootstrap" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.444550 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" containerName="dnsmasq-dns" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.446308 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.451043 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rzps4" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.451735 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.458813 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.458857 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.459348 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.460126 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.465277 4788 scope.go:117] "RemoveContainer" containerID="3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.469075 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-748f6c7c59-q59qb"] Feb 19 09:02:32 crc kubenswrapper[4788]: E0219 09:02:32.479164 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873\": container with ID starting with 3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873 not found: ID does not exist" containerID="3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.479205 4788 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873"} err="failed to get container status \"3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873\": rpc error: code = NotFound desc = could not find container \"3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873\": container with ID starting with 3750beb349c0270eb9e63e54ccc23cd736193ad9c93ce2ff3d9d14940f729873 not found: ID does not exist" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.479231 4788 scope.go:117] "RemoveContainer" containerID="0627956e9217fd75ce8ee5fcbbf83d62f58212a44d51398d4d58253e76aefed6" Feb 19 09:02:32 crc kubenswrapper[4788]: E0219 09:02:32.479652 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0627956e9217fd75ce8ee5fcbbf83d62f58212a44d51398d4d58253e76aefed6\": container with ID starting with 0627956e9217fd75ce8ee5fcbbf83d62f58212a44d51398d4d58253e76aefed6 not found: ID does not exist" containerID="0627956e9217fd75ce8ee5fcbbf83d62f58212a44d51398d4d58253e76aefed6" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.479671 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0627956e9217fd75ce8ee5fcbbf83d62f58212a44d51398d4d58253e76aefed6"} err="failed to get container status \"0627956e9217fd75ce8ee5fcbbf83d62f58212a44d51398d4d58253e76aefed6\": rpc error: code = NotFound desc = could not find container \"0627956e9217fd75ce8ee5fcbbf83d62f58212a44d51398d4d58253e76aefed6\": container with ID starting with 0627956e9217fd75ce8ee5fcbbf83d62f58212a44d51398d4d58253e76aefed6 not found: ID does not exist" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.488740 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-z4jjr"] Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.498235 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-785d8bcb8c-z4jjr"] Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.629354 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-public-tls-certs\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.630373 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-combined-ca-bundle\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.630489 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-fernet-keys\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.630526 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-internal-tls-certs\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.630806 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-scripts\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") 
" pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.630958 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-credential-keys\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.630994 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxxzz\" (UniqueName: \"kubernetes.io/projected/fa3fa772-2fba-4d00-993a-f240d053d0a9-kube-api-access-dxxzz\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.631149 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-config-data\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.728852 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6a3b3f-85b6-4492-95cb-5834f14fe5b1" path="/var/lib/kubelet/pods/7e6a3b3f-85b6-4492-95cb-5834f14fe5b1/volumes" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.733525 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-scripts\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.733611 4788 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-credential-keys\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.733639 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxxzz\" (UniqueName: \"kubernetes.io/projected/fa3fa772-2fba-4d00-993a-f240d053d0a9-kube-api-access-dxxzz\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.733683 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-config-data\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.733784 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-public-tls-certs\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.733817 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-combined-ca-bundle\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.733858 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-fernet-keys\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.733882 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-internal-tls-certs\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.737879 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-scripts\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.742764 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-public-tls-certs\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.747695 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-fernet-keys\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.747865 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-internal-tls-certs\") pod \"keystone-748f6c7c59-q59qb\" (UID: 
\"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.753422 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-config-data\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.763218 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-credential-keys\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.763798 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxxzz\" (UniqueName: \"kubernetes.io/projected/fa3fa772-2fba-4d00-993a-f240d053d0a9-kube-api-access-dxxzz\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.773131 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa3fa772-2fba-4d00-993a-f240d053d0a9-combined-ca-bundle\") pod \"keystone-748f6c7c59-q59qb\" (UID: \"fa3fa772-2fba-4d00-993a-f240d053d0a9\") " pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:32 crc kubenswrapper[4788]: I0219 09:02:32.798090 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:33 crc kubenswrapper[4788]: I0219 09:02:33.329846 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 09:02:33 crc kubenswrapper[4788]: I0219 09:02:33.330146 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 09:02:33 crc kubenswrapper[4788]: I0219 09:02:33.330434 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 09:02:33 crc kubenswrapper[4788]: I0219 09:02:33.330454 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 09:02:33 crc kubenswrapper[4788]: I0219 09:02:33.469920 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-748f6c7c59-q59qb"] Feb 19 09:02:34 crc kubenswrapper[4788]: I0219 09:02:34.356534 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-748f6c7c59-q59qb" event={"ID":"fa3fa772-2fba-4d00-993a-f240d053d0a9","Type":"ContainerStarted","Data":"9239d1712c3376ccd10de8989e15b3dd3b63840b0c7426e4b5e56dbf07e9e4d0"} Feb 19 09:02:34 crc kubenswrapper[4788]: I0219 09:02:34.356821 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-748f6c7c59-q59qb" event={"ID":"fa3fa772-2fba-4d00-993a-f240d053d0a9","Type":"ContainerStarted","Data":"7a5e615f5c2dc63af6c8b15cbbfd52a847bf7fcef1d3587a2ba4bb3d624e6753"} Feb 19 09:02:34 crc kubenswrapper[4788]: I0219 09:02:34.356941 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:02:34 crc kubenswrapper[4788]: I0219 09:02:34.359596 4788 generic.go:334] "Generic (PLEG): container finished" podID="c7fed6a4-4d87-463f-84d2-942c28422b8b" containerID="586d43b2216b6b3a2959b5d5b920474f8cbfddf9badec5646f683200069c85d3" exitCode=0 Feb 19 09:02:34 crc kubenswrapper[4788]: I0219 09:02:34.359650 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-2dgzz" event={"ID":"c7fed6a4-4d87-463f-84d2-942c28422b8b","Type":"ContainerDied","Data":"586d43b2216b6b3a2959b5d5b920474f8cbfddf9badec5646f683200069c85d3"} Feb 19 09:02:34 crc kubenswrapper[4788]: I0219 09:02:34.382912 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-748f6c7c59-q59qb" podStartSLOduration=2.38289006 podStartE2EDuration="2.38289006s" podCreationTimestamp="2026-02-19 09:02:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:34.377515441 +0000 UTC m=+1056.365526923" watchObservedRunningTime="2026-02-19 09:02:34.38289006 +0000 UTC m=+1056.370901532" Feb 19 09:02:34 crc kubenswrapper[4788]: I0219 09:02:34.467308 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:34 crc kubenswrapper[4788]: I0219 09:02:34.467409 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 09:02:34 crc kubenswrapper[4788]: I0219 09:02:34.617294 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 09:02:35 crc kubenswrapper[4788]: I0219 09:02:35.184054 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 09:02:35 crc kubenswrapper[4788]: I0219 09:02:35.184134 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 09:02:35 crc kubenswrapper[4788]: I0219 09:02:35.195807 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 09:02:36 crc kubenswrapper[4788]: I0219 09:02:36.398067 4788 generic.go:334] "Generic (PLEG): container finished" podID="145312e4-8a69-4c17-964b-2183e2ff66b4" 
containerID="62e0209a0aafc547777ddc5ebb626e3bf3ed3823d504a8e54df4338e120db24a" exitCode=0 Feb 19 09:02:36 crc kubenswrapper[4788]: I0219 09:02:36.398600 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wllnd" event={"ID":"145312e4-8a69-4c17-964b-2183e2ff66b4","Type":"ContainerDied","Data":"62e0209a0aafc547777ddc5ebb626e3bf3ed3823d504a8e54df4338e120db24a"} Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.423058 4788 generic.go:334] "Generic (PLEG): container finished" podID="708e9c03-709f-4846-bf0f-abb71c9e164f" containerID="1d69b08f03a65afdc2102ef07ccfb54e9ca1781bf3905d5c42f892f9453078e1" exitCode=0 Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.423231 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wf626" event={"ID":"708e9c03-709f-4846-bf0f-abb71c9e164f","Type":"ContainerDied","Data":"1d69b08f03a65afdc2102ef07ccfb54e9ca1781bf3905d5c42f892f9453078e1"} Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.510817 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2dgzz" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.649553 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wllnd" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.675436 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mhwm\" (UniqueName: \"kubernetes.io/projected/c7fed6a4-4d87-463f-84d2-942c28422b8b-kube-api-access-4mhwm\") pod \"c7fed6a4-4d87-463f-84d2-942c28422b8b\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.675604 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-config-data\") pod \"c7fed6a4-4d87-463f-84d2-942c28422b8b\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.675653 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7fed6a4-4d87-463f-84d2-942c28422b8b-logs\") pod \"c7fed6a4-4d87-463f-84d2-942c28422b8b\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.675679 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-scripts\") pod \"c7fed6a4-4d87-463f-84d2-942c28422b8b\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.675713 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-combined-ca-bundle\") pod \"c7fed6a4-4d87-463f-84d2-942c28422b8b\" (UID: \"c7fed6a4-4d87-463f-84d2-942c28422b8b\") " Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.676640 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c7fed6a4-4d87-463f-84d2-942c28422b8b-logs" (OuterVolumeSpecName: "logs") pod "c7fed6a4-4d87-463f-84d2-942c28422b8b" (UID: "c7fed6a4-4d87-463f-84d2-942c28422b8b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.680990 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7fed6a4-4d87-463f-84d2-942c28422b8b-kube-api-access-4mhwm" (OuterVolumeSpecName: "kube-api-access-4mhwm") pod "c7fed6a4-4d87-463f-84d2-942c28422b8b" (UID: "c7fed6a4-4d87-463f-84d2-942c28422b8b"). InnerVolumeSpecName "kube-api-access-4mhwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.681565 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-scripts" (OuterVolumeSpecName: "scripts") pod "c7fed6a4-4d87-463f-84d2-942c28422b8b" (UID: "c7fed6a4-4d87-463f-84d2-942c28422b8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.718873 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7fed6a4-4d87-463f-84d2-942c28422b8b" (UID: "c7fed6a4-4d87-463f-84d2-942c28422b8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.742514 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-config-data" (OuterVolumeSpecName: "config-data") pod "c7fed6a4-4d87-463f-84d2-942c28422b8b" (UID: "c7fed6a4-4d87-463f-84d2-942c28422b8b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.778428 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhg86\" (UniqueName: \"kubernetes.io/projected/145312e4-8a69-4c17-964b-2183e2ff66b4-kube-api-access-zhg86\") pod \"145312e4-8a69-4c17-964b-2183e2ff66b4\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.778664 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-combined-ca-bundle\") pod \"145312e4-8a69-4c17-964b-2183e2ff66b4\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.778806 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-db-sync-config-data\") pod \"145312e4-8a69-4c17-964b-2183e2ff66b4\" (UID: \"145312e4-8a69-4c17-964b-2183e2ff66b4\") " Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.779275 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.779377 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7fed6a4-4d87-463f-84d2-942c28422b8b-logs\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.779457 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.779538 4788 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7fed6a4-4d87-463f-84d2-942c28422b8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.779620 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mhwm\" (UniqueName: \"kubernetes.io/projected/c7fed6a4-4d87-463f-84d2-942c28422b8b-kube-api-access-4mhwm\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.783282 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145312e4-8a69-4c17-964b-2183e2ff66b4-kube-api-access-zhg86" (OuterVolumeSpecName: "kube-api-access-zhg86") pod "145312e4-8a69-4c17-964b-2183e2ff66b4" (UID: "145312e4-8a69-4c17-964b-2183e2ff66b4"). InnerVolumeSpecName "kube-api-access-zhg86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.783820 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "145312e4-8a69-4c17-964b-2183e2ff66b4" (UID: "145312e4-8a69-4c17-964b-2183e2ff66b4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.833238 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "145312e4-8a69-4c17-964b-2183e2ff66b4" (UID: "145312e4-8a69-4c17-964b-2183e2ff66b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.881637 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhg86\" (UniqueName: \"kubernetes.io/projected/145312e4-8a69-4c17-964b-2183e2ff66b4-kube-api-access-zhg86\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.881895 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:38 crc kubenswrapper[4788]: I0219 09:02:38.881958 4788 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/145312e4-8a69-4c17-964b-2183e2ff66b4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.433914 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"604aa3ed-40d6-437a-93f3-0e7a445b862b","Type":"ContainerStarted","Data":"2a5dbb1d57bfd2b196066fab31c23d95bb570bcff4081dec0df1c58832954eb8"} Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.437062 4788 generic.go:334] "Generic (PLEG): container finished" podID="c2a3acb8-146c-47c0-9218-81cd2728edf9" containerID="9b492f113d579d3a32fbe3bf60ec81d340c3244557313d4ea483895235b6e6df" exitCode=0 Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.437166 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8lpgx" event={"ID":"c2a3acb8-146c-47c0-9218-81cd2728edf9","Type":"ContainerDied","Data":"9b492f113d579d3a32fbe3bf60ec81d340c3244557313d4ea483895235b6e6df"} Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.439106 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wllnd" 
event={"ID":"145312e4-8a69-4c17-964b-2183e2ff66b4","Type":"ContainerDied","Data":"a3903643c6a0b9b17661e85d4080cef7b7bd5b0e0497adfc6a315343ea5f2844"} Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.439145 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3903643c6a0b9b17661e85d4080cef7b7bd5b0e0497adfc6a315343ea5f2844" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.439157 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wllnd" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.442410 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2dgzz" event={"ID":"c7fed6a4-4d87-463f-84d2-942c28422b8b","Type":"ContainerDied","Data":"4e677ffb6919250ddcb8b49a52adeb01dc87ae16c3ce6763a3cb74bfa3a1136f"} Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.442469 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2dgzz" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.442472 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e677ffb6919250ddcb8b49a52adeb01dc87ae16c3ce6763a3cb74bfa3a1136f" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.657030 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-86464574f6-lv4mn"] Feb 19 09:02:39 crc kubenswrapper[4788]: E0219 09:02:39.657670 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145312e4-8a69-4c17-964b-2183e2ff66b4" containerName="barbican-db-sync" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.657686 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="145312e4-8a69-4c17-964b-2183e2ff66b4" containerName="barbican-db-sync" Feb 19 09:02:39 crc kubenswrapper[4788]: E0219 09:02:39.657723 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7fed6a4-4d87-463f-84d2-942c28422b8b" 
containerName="placement-db-sync" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.657731 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7fed6a4-4d87-463f-84d2-942c28422b8b" containerName="placement-db-sync" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.657960 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="145312e4-8a69-4c17-964b-2183e2ff66b4" containerName="barbican-db-sync" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.657983 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7fed6a4-4d87-463f-84d2-942c28422b8b" containerName="placement-db-sync" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.659285 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.662182 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.662179 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.664209 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.665681 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-gvv2s" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.665864 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.686849 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-86464574f6-lv4mn"] Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.805337 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-combined-ca-bundle\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.805378 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xcfx\" (UniqueName: \"kubernetes.io/projected/9b9de162-c3e4-4ab7-a29d-4dabf60de673-kube-api-access-2xcfx\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.805424 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-internal-tls-certs\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.805450 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-scripts\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.805473 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-config-data\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.805575 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-public-tls-certs\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.805593 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b9de162-c3e4-4ab7-a29d-4dabf60de673-logs\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.831753 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wf626" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.873468 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5565784c67-nzhww"] Feb 19 09:02:39 crc kubenswrapper[4788]: E0219 09:02:39.873916 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708e9c03-709f-4846-bf0f-abb71c9e164f" containerName="heat-db-sync" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.873932 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="708e9c03-709f-4846-bf0f-abb71c9e164f" containerName="heat-db-sync" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.874158 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="708e9c03-709f-4846-bf0f-abb71c9e164f" containerName="heat-db-sync" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.875161 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.880296 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.880597 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.880855 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zxtj7" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.886745 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-78686fb9d-c7vd2"] Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.888231 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.893571 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.903067 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5565784c67-nzhww"] Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.910514 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-public-tls-certs\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.910561 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b9de162-c3e4-4ab7-a29d-4dabf60de673-logs\") pod \"placement-86464574f6-lv4mn\" (UID: 
\"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.910632 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-combined-ca-bundle\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.910655 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xcfx\" (UniqueName: \"kubernetes.io/projected/9b9de162-c3e4-4ab7-a29d-4dabf60de673-kube-api-access-2xcfx\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.910711 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-internal-tls-certs\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.910758 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-scripts\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.910798 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-config-data\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " 
pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.922584 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b9de162-c3e4-4ab7-a29d-4dabf60de673-logs\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.934052 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-internal-tls-certs\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.936311 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78686fb9d-c7vd2"] Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.942026 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-combined-ca-bundle\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.942888 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-config-data\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.943213 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-public-tls-certs\") pod \"placement-86464574f6-lv4mn\" 
(UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.957854 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b9de162-c3e4-4ab7-a29d-4dabf60de673-scripts\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.961206 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xcfx\" (UniqueName: \"kubernetes.io/projected/9b9de162-c3e4-4ab7-a29d-4dabf60de673-kube-api-access-2xcfx\") pod \"placement-86464574f6-lv4mn\" (UID: \"9b9de162-c3e4-4ab7-a29d-4dabf60de673\") " pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.986894 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjn9l"] Feb 19 09:02:39 crc kubenswrapper[4788]: I0219 09:02:39.992558 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.001181 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjn9l"] Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.006870 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.014122 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn525\" (UniqueName: \"kubernetes.io/projected/708e9c03-709f-4846-bf0f-abb71c9e164f-kube-api-access-xn525\") pod \"708e9c03-709f-4846-bf0f-abb71c9e164f\" (UID: \"708e9c03-709f-4846-bf0f-abb71c9e164f\") " Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.014210 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-config-data\") pod \"708e9c03-709f-4846-bf0f-abb71c9e164f\" (UID: \"708e9c03-709f-4846-bf0f-abb71c9e164f\") " Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.014273 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-combined-ca-bundle\") pod \"708e9c03-709f-4846-bf0f-abb71c9e164f\" (UID: \"708e9c03-709f-4846-bf0f-abb71c9e164f\") " Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.014631 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a88dc2-18b9-4d55-9930-0c3396063e8b-logs\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.014678 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jxsd\" (UniqueName: \"kubernetes.io/projected/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-kube-api-access-8jxsd\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 
09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.014697 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a88dc2-18b9-4d55-9930-0c3396063e8b-combined-ca-bundle\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.014726 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-logs\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.014744 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0a88dc2-18b9-4d55-9930-0c3396063e8b-config-data-custom\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.014764 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-config-data-custom\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.014794 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a88dc2-18b9-4d55-9930-0c3396063e8b-config-data\") pod \"barbican-worker-5565784c67-nzhww\" 
(UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.014827 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f77p6\" (UniqueName: \"kubernetes.io/projected/c0a88dc2-18b9-4d55-9930-0c3396063e8b-kube-api-access-f77p6\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.017925 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-config-data\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.017970 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-combined-ca-bundle\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.021980 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/708e9c03-709f-4846-bf0f-abb71c9e164f-kube-api-access-xn525" (OuterVolumeSpecName: "kube-api-access-xn525") pod "708e9c03-709f-4846-bf0f-abb71c9e164f" (UID: "708e9c03-709f-4846-bf0f-abb71c9e164f"). InnerVolumeSpecName "kube-api-access-xn525". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.050616 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "708e9c03-709f-4846-bf0f-abb71c9e164f" (UID: "708e9c03-709f-4846-bf0f-abb71c9e164f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.105573 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-54b6649458-8rbkh"] Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.107014 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.109472 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.117806 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54b6649458-8rbkh"] Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120288 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120349 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jxsd\" (UniqueName: \"kubernetes.io/projected/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-kube-api-access-8jxsd\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " 
pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120380 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0a88dc2-18b9-4d55-9930-0c3396063e8b-combined-ca-bundle\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120417 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-logs\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120441 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0a88dc2-18b9-4d55-9930-0c3396063e8b-config-data-custom\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120460 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120509 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-config-data-custom\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: 
\"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120544 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120573 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a88dc2-18b9-4d55-9930-0c3396063e8b-config-data\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120609 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fff7\" (UniqueName: \"kubernetes.io/projected/afd0a9f9-61a6-4380-8a98-2ddb06119202-kube-api-access-6fff7\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120640 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f77p6\" (UniqueName: \"kubernetes.io/projected/c0a88dc2-18b9-4d55-9930-0c3396063e8b-kube-api-access-f77p6\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120695 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-config-data\") pod 
\"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120718 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-combined-ca-bundle\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120742 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-config\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120797 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-svc\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120833 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a88dc2-18b9-4d55-9930-0c3396063e8b-logs\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120899 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn525\" (UniqueName: \"kubernetes.io/projected/708e9c03-709f-4846-bf0f-abb71c9e164f-kube-api-access-xn525\") on node 
\"crc\" DevicePath \"\"" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.120913 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.121344 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0a88dc2-18b9-4d55-9930-0c3396063e8b-logs\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.122018 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-logs\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.129283 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0a88dc2-18b9-4d55-9930-0c3396063e8b-config-data\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.130626 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-config-data\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.130858 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c0a88dc2-18b9-4d55-9930-0c3396063e8b-combined-ca-bundle\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.154067 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-combined-ca-bundle\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.155619 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-config-data-custom\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.157504 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0a88dc2-18b9-4d55-9930-0c3396063e8b-config-data-custom\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.167481 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jxsd\" (UniqueName: \"kubernetes.io/projected/0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5-kube-api-access-8jxsd\") pod \"barbican-keystone-listener-78686fb9d-c7vd2\" (UID: \"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5\") " pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.169085 4788 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-config-data" (OuterVolumeSpecName: "config-data") pod "708e9c03-709f-4846-bf0f-abb71c9e164f" (UID: "708e9c03-709f-4846-bf0f-abb71c9e164f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.174495 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f77p6\" (UniqueName: \"kubernetes.io/projected/c0a88dc2-18b9-4d55-9930-0c3396063e8b-kube-api-access-f77p6\") pod \"barbican-worker-5565784c67-nzhww\" (UID: \"c0a88dc2-18b9-4d55-9930-0c3396063e8b\") " pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.204151 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5565784c67-nzhww" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.222318 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvm4q\" (UniqueName: \"kubernetes.io/projected/f9e7e6e3-ef8d-4863-a024-61596fa46d51-kube-api-access-mvm4q\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.222364 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-svc\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.222385 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9e7e6e3-ef8d-4863-a024-61596fa46d51-logs\") pod \"barbican-api-54b6649458-8rbkh\" (UID: 
\"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.222429 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.222775 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.223322 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.223409 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-svc\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.224080 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.224189 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.224333 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data-custom\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.224364 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.224390 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fff7\" (UniqueName: \"kubernetes.io/projected/afd0a9f9-61a6-4380-8a98-2ddb06119202-kube-api-access-6fff7\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.224395 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.224481 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-combined-ca-bundle\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.224544 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-config\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.224632 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708e9c03-709f-4846-bf0f-abb71c9e164f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.224924 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.225339 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-config\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.240877 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fff7\" (UniqueName: \"kubernetes.io/projected/afd0a9f9-61a6-4380-8a98-2ddb06119202-kube-api-access-6fff7\") pod \"dnsmasq-dns-85ff748b95-xjn9l\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " 
pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.326712 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data-custom\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.327839 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.328119 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-combined-ca-bundle\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.328332 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvm4q\" (UniqueName: \"kubernetes.io/projected/f9e7e6e3-ef8d-4863-a024-61596fa46d51-kube-api-access-mvm4q\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.328391 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9e7e6e3-ef8d-4863-a024-61596fa46d51-logs\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh" Feb 
19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.329586 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9e7e6e3-ef8d-4863-a024-61596fa46d51-logs\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh"
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.332551 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xjn9l"
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.335291 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-combined-ca-bundle\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh"
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.336273 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data-custom\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh"
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.344207 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh"
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.364216 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvm4q\" (UniqueName: \"kubernetes.io/projected/f9e7e6e3-ef8d-4863-a024-61596fa46d51-kube-api-access-mvm4q\") pod \"barbican-api-54b6649458-8rbkh\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " pod="openstack/barbican-api-54b6649458-8rbkh"
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.455374 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wf626"
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.455454 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wf626" event={"ID":"708e9c03-709f-4846-bf0f-abb71c9e164f","Type":"ContainerDied","Data":"c9a6caf271aed008f9b9370b0f59d811690c08352baddb40c5e6826f7b5a7d7f"}
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.455481 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9a6caf271aed008f9b9370b0f59d811690c08352baddb40c5e6826f7b5a7d7f"
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.476390 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54b6649458-8rbkh"
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.580679 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-86464574f6-lv4mn"]
Feb 19 09:02:40 crc kubenswrapper[4788]: W0219 09:02:40.693215 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b9de162_c3e4_4ab7_a29d_4dabf60de673.slice/crio-288ed00ce042f9f5217c6f09da85cf463478ab372113ea9a64fcc1173facef05 WatchSource:0}: Error finding container 288ed00ce042f9f5217c6f09da85cf463478ab372113ea9a64fcc1173facef05: Status 404 returned error can't find the container with id 288ed00ce042f9f5217c6f09da85cf463478ab372113ea9a64fcc1173facef05
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.752976 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5565784c67-nzhww"]
Feb 19 09:02:40 crc kubenswrapper[4788]: I0219 09:02:40.843816 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78686fb9d-c7vd2"]
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.068035 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8lpgx"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.129173 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c6b98d796-z866s"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.162808 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-db-sync-config-data\") pod \"c2a3acb8-146c-47c0-9218-81cd2728edf9\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") "
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.162934 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-scripts\") pod \"c2a3acb8-146c-47c0-9218-81cd2728edf9\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") "
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.162970 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctgvz\" (UniqueName: \"kubernetes.io/projected/c2a3acb8-146c-47c0-9218-81cd2728edf9-kube-api-access-ctgvz\") pod \"c2a3acb8-146c-47c0-9218-81cd2728edf9\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") "
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.162987 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-combined-ca-bundle\") pod \"c2a3acb8-146c-47c0-9218-81cd2728edf9\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") "
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.163017 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2a3acb8-146c-47c0-9218-81cd2728edf9-etc-machine-id\") pod \"c2a3acb8-146c-47c0-9218-81cd2728edf9\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") "
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.163095 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-config-data\") pod \"c2a3acb8-146c-47c0-9218-81cd2728edf9\" (UID: \"c2a3acb8-146c-47c0-9218-81cd2728edf9\") "
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.168327 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c2a3acb8-146c-47c0-9218-81cd2728edf9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c2a3acb8-146c-47c0-9218-81cd2728edf9" (UID: "c2a3acb8-146c-47c0-9218-81cd2728edf9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.175305 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjn9l"]
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.177237 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-scripts" (OuterVolumeSpecName: "scripts") pod "c2a3acb8-146c-47c0-9218-81cd2728edf9" (UID: "c2a3acb8-146c-47c0-9218-81cd2728edf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.186551 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c2a3acb8-146c-47c0-9218-81cd2728edf9" (UID: "c2a3acb8-146c-47c0-9218-81cd2728edf9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.186615 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a3acb8-146c-47c0-9218-81cd2728edf9-kube-api-access-ctgvz" (OuterVolumeSpecName: "kube-api-access-ctgvz") pod "c2a3acb8-146c-47c0-9218-81cd2728edf9" (UID: "c2a3acb8-146c-47c0-9218-81cd2728edf9"). InnerVolumeSpecName "kube-api-access-ctgvz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.217616 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2a3acb8-146c-47c0-9218-81cd2728edf9" (UID: "c2a3acb8-146c-47c0-9218-81cd2728edf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.251908 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-config-data" (OuterVolumeSpecName: "config-data") pod "c2a3acb8-146c-47c0-9218-81cd2728edf9" (UID: "c2a3acb8-146c-47c0-9218-81cd2728edf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.265357 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.265386 4788 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.265402 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.265412 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctgvz\" (UniqueName: \"kubernetes.io/projected/c2a3acb8-146c-47c0-9218-81cd2728edf9-kube-api-access-ctgvz\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.265421 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2a3acb8-146c-47c0-9218-81cd2728edf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.265431 4788 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2a3acb8-146c-47c0-9218-81cd2728edf9-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.364831 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54b6649458-8rbkh"]
Feb 19 09:02:41 crc kubenswrapper[4788]: W0219 09:02:41.368834 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e7e6e3_ef8d_4863_a024_61596fa46d51.slice/crio-08ea41ff5e49273b9e1172295e2d60cb625da8effe1941616cbdf94c3cffc4ae WatchSource:0}: Error finding container 08ea41ff5e49273b9e1172295e2d60cb625da8effe1941616cbdf94c3cffc4ae: Status 404 returned error can't find the container with id 08ea41ff5e49273b9e1172295e2d60cb625da8effe1941616cbdf94c3cffc4ae
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.461202 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" event={"ID":"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5","Type":"ContainerStarted","Data":"82a02c6d5fc08591117ab3e6018d0e489c8a04e49abddc8737aa432b77050c90"}
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.462951 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5565784c67-nzhww" event={"ID":"c0a88dc2-18b9-4d55-9930-0c3396063e8b","Type":"ContainerStarted","Data":"a6a7492fd1114df4a0fc31601353041c477487bfbed201155fa2cf8391afbdfe"}
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.464155 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" event={"ID":"afd0a9f9-61a6-4380-8a98-2ddb06119202","Type":"ContainerStarted","Data":"c1e0af28bf2133207518f8e0f01c3953dd29e8bb3ec32a32975cd58c83cfb56a"}
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.473879 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b6649458-8rbkh" event={"ID":"f9e7e6e3-ef8d-4863-a024-61596fa46d51","Type":"ContainerStarted","Data":"08ea41ff5e49273b9e1172295e2d60cb625da8effe1941616cbdf94c3cffc4ae"}
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.518455 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86464574f6-lv4mn" event={"ID":"9b9de162-c3e4-4ab7-a29d-4dabf60de673","Type":"ContainerStarted","Data":"bcca3fe385f0eb7a4586c914459dffe0ebecef846cfb9b8d40c014151965a7d2"}
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.518513 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86464574f6-lv4mn" event={"ID":"9b9de162-c3e4-4ab7-a29d-4dabf60de673","Type":"ContainerStarted","Data":"288ed00ce042f9f5217c6f09da85cf463478ab372113ea9a64fcc1173facef05"}
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.521212 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-8lpgx" event={"ID":"c2a3acb8-146c-47c0-9218-81cd2728edf9","Type":"ContainerDied","Data":"b8d12f04b1f95d703245bce0e2aa59c9e26cb7d0a4f80e022d389beb950dd1d8"}
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.521272 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8d12f04b1f95d703245bce0e2aa59c9e26cb7d0a4f80e022d389beb950dd1d8"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.521323 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-8lpgx"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.528774 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79f86df99f-fntgk"]
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.529072 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79f86df99f-fntgk" podUID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" containerName="neutron-api" containerID="cri-o://5c62216ae4db61c5ade08749fc3bc7572732577d80b0dcf9393ae1dcf8fef31c" gracePeriod=30
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.529208 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-79f86df99f-fntgk" podUID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" containerName="neutron-httpd" containerID="cri-o://8150d321c659059affed0871779c3db7c71e68fe07696885a063990924a72086" gracePeriod=30
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.570216 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-79f86df99f-fntgk" podUID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.154:9696/\": EOF"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.602339 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-647d45fc9-99jnj"]
Feb 19 09:02:41 crc kubenswrapper[4788]: E0219 09:02:41.602833 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a3acb8-146c-47c0-9218-81cd2728edf9" containerName="cinder-db-sync"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.602849 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a3acb8-146c-47c0-9218-81cd2728edf9" containerName="cinder-db-sync"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.603081 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a3acb8-146c-47c0-9218-81cd2728edf9" containerName="cinder-db-sync"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.604165 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.628875 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-647d45fc9-99jnj"]
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.687897 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-config\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.688218 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-combined-ca-bundle\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.688265 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-httpd-config\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.688286 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-ovndb-tls-certs\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.688322 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-internal-tls-certs\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.688401 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45v4h\" (UniqueName: \"kubernetes.io/projected/27632492-f51f-49c6-a63a-d037329d57e9-kube-api-access-45v4h\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.688448 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-public-tls-certs\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.765327 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.766812 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.771193 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.771516 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.771710 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-vg8sl"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.772362 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.790488 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45v4h\" (UniqueName: \"kubernetes.io/projected/27632492-f51f-49c6-a63a-d037329d57e9-kube-api-access-45v4h\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.790570 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-public-tls-certs\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.790674 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-config\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.790703 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-combined-ca-bundle\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.790730 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-httpd-config\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.790748 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-ovndb-tls-certs\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.790795 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-internal-tls-certs\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.800896 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-internal-tls-certs\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.802921 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.819177 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-config\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.822282 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-ovndb-tls-certs\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.824039 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-httpd-config\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.825128 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-combined-ca-bundle\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.842356 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27632492-f51f-49c6-a63a-d037329d57e9-public-tls-certs\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.847790 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjn9l"]
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.855284 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45v4h\" (UniqueName: \"kubernetes.io/projected/27632492-f51f-49c6-a63a-d037329d57e9-kube-api-access-45v4h\") pod \"neutron-647d45fc9-99jnj\" (UID: \"27632492-f51f-49c6-a63a-d037329d57e9\") " pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.876650 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gmkv7"]
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.879019 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.891948 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.892027 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-scripts\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.892084 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.892121 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkv7t\" (UniqueName: \"kubernetes.io/projected/183a54b2-c97f-47de-93f6-024d497fffe6-kube-api-access-qkv7t\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.892203 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.892298 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/183a54b2-c97f-47de-93f6-024d497fffe6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.900188 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gmkv7"]
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.952728 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.954016 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.956823 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 19 09:02:41 crc kubenswrapper[4788]: I0219 09:02:41.961856 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.022361 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-647d45fc9-99jnj"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023234 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzds6\" (UniqueName: \"kubernetes.io/projected/379f970e-625d-401e-b625-a81a8e19ec02-kube-api-access-rzds6\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023351 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023405 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-config\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023463 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023528 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/183a54b2-c97f-47de-93f6-024d497fffe6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023564 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023611 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023643 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-scripts\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023686 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023718 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023755 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkv7t\" (UniqueName: \"kubernetes.io/projected/183a54b2-c97f-47de-93f6-024d497fffe6-kube-api-access-qkv7t\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.023781 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.028751 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/183a54b2-c97f-47de-93f6-024d497fffe6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.043133 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.057942 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkv7t\" (UniqueName: \"kubernetes.io/projected/183a54b2-c97f-47de-93f6-024d497fffe6-kube-api-access-qkv7t\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.057997 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.058899 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.059015 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-scripts\") pod \"cinder-scheduler-0\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") " pod="openstack/cinder-scheduler-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.124997 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzds6\" (UniqueName: \"kubernetes.io/projected/379f970e-625d-401e-b625-a81a8e19ec02-kube-api-access-rzds6\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125230 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-config\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125304 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125337 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e924ad6-b71b-400f-af06-406e9e2341e4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125387 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data-custom\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125405 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125423 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0"
Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125447 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 
09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125703 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125734 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-scripts\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125783 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc59r\" (UniqueName: \"kubernetes.io/projected/9e924ad6-b71b-400f-af06-406e9e2341e4-kube-api-access-rc59r\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125813 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.125871 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e924ad6-b71b-400f-af06-406e9e2341e4-logs\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.126238 4788 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-config\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.126610 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.126670 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.126832 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.127499 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.151975 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzds6\" (UniqueName: 
\"kubernetes.io/projected/379f970e-625d-401e-b625-a81a8e19ec02-kube-api-access-rzds6\") pod \"dnsmasq-dns-5c9776ccc5-gmkv7\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.183359 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.224711 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.226917 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e924ad6-b71b-400f-af06-406e9e2341e4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.226997 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data-custom\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.227025 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.227051 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " 
pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.227090 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-scripts\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.227130 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc59r\" (UniqueName: \"kubernetes.io/projected/9e924ad6-b71b-400f-af06-406e9e2341e4-kube-api-access-rc59r\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.227188 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e924ad6-b71b-400f-af06-406e9e2341e4-logs\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.228521 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e924ad6-b71b-400f-af06-406e9e2341e4-logs\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.231095 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e924ad6-b71b-400f-af06-406e9e2341e4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.232060 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data-custom\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.232132 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.238228 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-scripts\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.242820 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.249470 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc59r\" (UniqueName: \"kubernetes.io/projected/9e924ad6-b71b-400f-af06-406e9e2341e4-kube-api-access-rc59r\") pod \"cinder-api-0\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.295969 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.548570 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b6649458-8rbkh" event={"ID":"f9e7e6e3-ef8d-4863-a024-61596fa46d51","Type":"ContainerStarted","Data":"4ad1b7c830e57d5b6f53c0a0cac3edca8ef3c10808e0aaa87a07eb117c6aa64f"} Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.548825 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b6649458-8rbkh" event={"ID":"f9e7e6e3-ef8d-4863-a024-61596fa46d51","Type":"ContainerStarted","Data":"e03f6721cae8cdf981bb9755146b82c7557357151ac5e734d6dd218b83ffe87e"} Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.550508 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.550532 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.574762 4788 generic.go:334] "Generic (PLEG): container finished" podID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" containerID="8150d321c659059affed0871779c3db7c71e68fe07696885a063990924a72086" exitCode=0 Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.574849 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f86df99f-fntgk" event={"ID":"72d8f24f-5e9f-480b-8f17-178d59ebc51d","Type":"ContainerDied","Data":"8150d321c659059affed0871779c3db7c71e68fe07696885a063990924a72086"} Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.581551 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-54b6649458-8rbkh" podStartSLOduration=2.581533587 podStartE2EDuration="2.581533587s" podCreationTimestamp="2026-02-19 09:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:42.571022344 +0000 UTC m=+1064.559033816" watchObservedRunningTime="2026-02-19 09:02:42.581533587 +0000 UTC m=+1064.569545059" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.588082 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-86464574f6-lv4mn" event={"ID":"9b9de162-c3e4-4ab7-a29d-4dabf60de673","Type":"ContainerStarted","Data":"887973acc5db2be01cd1a2801fa0fe7306715c23edffacffae4d16497d46b98d"} Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.589140 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.589162 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.603857 4788 generic.go:334] "Generic (PLEG): container finished" podID="afd0a9f9-61a6-4380-8a98-2ddb06119202" containerID="9742d9ae490943cf777aceb11e9e14a8208e09c0443932e95ce100d5d5ffecf0" exitCode=0 Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.603900 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" event={"ID":"afd0a9f9-61a6-4380-8a98-2ddb06119202","Type":"ContainerDied","Data":"9742d9ae490943cf777aceb11e9e14a8208e09c0443932e95ce100d5d5ffecf0"} Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.627352 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-86464574f6-lv4mn" podStartSLOduration=3.627331887 podStartE2EDuration="3.627331887s" podCreationTimestamp="2026-02-19 09:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:42.618501625 +0000 UTC m=+1064.606513117" watchObservedRunningTime="2026-02-19 09:02:42.627331887 +0000 UTC m=+1064.615343359" Feb 19 09:02:42 
crc kubenswrapper[4788]: I0219 09:02:42.811105 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 09:02:42 crc kubenswrapper[4788]: I0219 09:02:42.979813 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gmkv7"] Feb 19 09:02:43 crc kubenswrapper[4788]: I0219 09:02:43.370024 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 19 09:02:43 crc kubenswrapper[4788]: W0219 09:02:43.399577 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e924ad6_b71b_400f_af06_406e9e2341e4.slice/crio-2faf4cc09189aa3acf1b150c761a85ad8c7cbb7a0686350394512ae843a5935f WatchSource:0}: Error finding container 2faf4cc09189aa3acf1b150c761a85ad8c7cbb7a0686350394512ae843a5935f: Status 404 returned error can't find the container with id 2faf4cc09189aa3acf1b150c761a85ad8c7cbb7a0686350394512ae843a5935f Feb 19 09:02:43 crc kubenswrapper[4788]: I0219 09:02:43.502715 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-647d45fc9-99jnj"] Feb 19 09:02:43 crc kubenswrapper[4788]: I0219 09:02:43.614339 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" event={"ID":"379f970e-625d-401e-b625-a81a8e19ec02","Type":"ContainerStarted","Data":"1bf15969d8c0f72ba676d94388367ad28636561e19c17ec01f18672174dbd299"} Feb 19 09:02:43 crc kubenswrapper[4788]: I0219 09:02:43.615496 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e924ad6-b71b-400f-af06-406e9e2341e4","Type":"ContainerStarted","Data":"2faf4cc09189aa3acf1b150c761a85ad8c7cbb7a0686350394512ae843a5935f"} Feb 19 09:02:43 crc kubenswrapper[4788]: I0219 09:02:43.617540 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"183a54b2-c97f-47de-93f6-024d497fffe6","Type":"ContainerStarted","Data":"a725a4a9bb14f4a444b347127fe0dbd274895111573766892463d0f9fef6cef0"} Feb 19 09:02:43 crc kubenswrapper[4788]: E0219 09:02:43.694108 4788 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 19 09:02:43 crc kubenswrapper[4788]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/afd0a9f9-61a6-4380-8a98-2ddb06119202/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 19 09:02:43 crc kubenswrapper[4788]: > podSandboxID="c1e0af28bf2133207518f8e0f01c3953dd29e8bb3ec32a32975cd58c83cfb56a" Feb 19 09:02:43 crc kubenswrapper[4788]: E0219 09:02:43.694338 4788 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 09:02:43 crc kubenswrapper[4788]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7ch57ch5c5hcch589hf7h577h659h96h5c8h5b4h55fhbbh667h565h5bchcbh58dh7dh5bch586h56ch574h598h67dh5c8h56dh8bh574h564hbch7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6fff7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-85ff748b95-xjn9l_openstack(afd0a9f9-61a6-4380-8a98-2ddb06119202): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/afd0a9f9-61a6-4380-8a98-2ddb06119202/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 19 09:02:43 crc kubenswrapper[4788]: > logger="UnhandledError" Feb 19 09:02:43 crc kubenswrapper[4788]: E0219 09:02:43.699506 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/afd0a9f9-61a6-4380-8a98-2ddb06119202/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" podUID="afd0a9f9-61a6-4380-8a98-2ddb06119202" Feb 19 09:02:44 crc kubenswrapper[4788]: I0219 09:02:44.644697 4788 generic.go:334] "Generic (PLEG): container finished" podID="379f970e-625d-401e-b625-a81a8e19ec02" containerID="0b4d41e04ca5eeaec965bde080d2231eca884954ed95a372a6cdb32b858d63bc" exitCode=0 Feb 19 09:02:44 crc kubenswrapper[4788]: I0219 09:02:44.645195 4788 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" event={"ID":"379f970e-625d-401e-b625-a81a8e19ec02","Type":"ContainerDied","Data":"0b4d41e04ca5eeaec965bde080d2231eca884954ed95a372a6cdb32b858d63bc"} Feb 19 09:02:44 crc kubenswrapper[4788]: I0219 09:02:44.653004 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e924ad6-b71b-400f-af06-406e9e2341e4","Type":"ContainerStarted","Data":"093381a8bb52834eac100e41a38499bde99aaad2811427c5fe6c9df7ab179ef4"} Feb 19 09:02:44 crc kubenswrapper[4788]: I0219 09:02:44.656321 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647d45fc9-99jnj" event={"ID":"27632492-f51f-49c6-a63a-d037329d57e9","Type":"ContainerStarted","Data":"47c6b5303acd39413cdf1d6c83bb49d06ce27cfa20d770d4511cfd3ba694afb8"} Feb 19 09:02:44 crc kubenswrapper[4788]: I0219 09:02:44.656376 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647d45fc9-99jnj" event={"ID":"27632492-f51f-49c6-a63a-d037329d57e9","Type":"ContainerStarted","Data":"dfc8b8d6e69964822cd5252781fd6266d90a7d4841118ccb2f0a74b7d6d37e22"} Feb 19 09:02:44 crc kubenswrapper[4788]: I0219 09:02:44.659272 4788 generic.go:334] "Generic (PLEG): container finished" podID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" containerID="5c62216ae4db61c5ade08749fc3bc7572732577d80b0dcf9393ae1dcf8fef31c" exitCode=0 Feb 19 09:02:44 crc kubenswrapper[4788]: I0219 09:02:44.660102 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f86df99f-fntgk" event={"ID":"72d8f24f-5e9f-480b-8f17-178d59ebc51d","Type":"ContainerDied","Data":"5c62216ae4db61c5ade08749fc3bc7572732577d80b0dcf9393ae1dcf8fef31c"} Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.236857 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-79f86df99f-fntgk" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.707413 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5zrk\" (UniqueName: \"kubernetes.io/projected/72d8f24f-5e9f-480b-8f17-178d59ebc51d-kube-api-access-q5zrk\") pod \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.707464 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-ovndb-tls-certs\") pod \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.707532 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-combined-ca-bundle\") pod \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.707626 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-public-tls-certs\") pod \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.707693 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-internal-tls-certs\") pod \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.707775 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-config\") pod \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.707839 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-httpd-config\") pod \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\" (UID: \"72d8f24f-5e9f-480b-8f17-178d59ebc51d\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.713204 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "72d8f24f-5e9f-480b-8f17-178d59ebc51d" (UID: "72d8f24f-5e9f-480b-8f17-178d59ebc51d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.748729 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.810698 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-swift-storage-0\") pod \"afd0a9f9-61a6-4380-8a98-2ddb06119202\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.811078 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-svc\") pod \"afd0a9f9-61a6-4380-8a98-2ddb06119202\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.811098 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fff7\" (UniqueName: \"kubernetes.io/projected/afd0a9f9-61a6-4380-8a98-2ddb06119202-kube-api-access-6fff7\") pod \"afd0a9f9-61a6-4380-8a98-2ddb06119202\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.811134 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-config\") pod \"afd0a9f9-61a6-4380-8a98-2ddb06119202\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.811223 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-sb\") pod \"afd0a9f9-61a6-4380-8a98-2ddb06119202\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.812275 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-nb\") pod \"afd0a9f9-61a6-4380-8a98-2ddb06119202\" (UID: \"afd0a9f9-61a6-4380-8a98-2ddb06119202\") " Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.812681 4788 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.846538 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" event={"ID":"afd0a9f9-61a6-4380-8a98-2ddb06119202","Type":"ContainerDied","Data":"c1e0af28bf2133207518f8e0f01c3953dd29e8bb3ec32a32975cd58c83cfb56a"} Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.846594 4788 scope.go:117] "RemoveContainer" containerID="9742d9ae490943cf777aceb11e9e14a8208e09c0443932e95ce100d5d5ffecf0" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.846743 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-xjn9l" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.855759 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-79f86df99f-fntgk" event={"ID":"72d8f24f-5e9f-480b-8f17-178d59ebc51d","Type":"ContainerDied","Data":"bba8c2f89f3776ac73fd386e96efe04fe37b1ab85a377fd382206020bb86797f"} Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.855861 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-79f86df99f-fntgk" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.856092 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d8f24f-5e9f-480b-8f17-178d59ebc51d-kube-api-access-q5zrk" (OuterVolumeSpecName: "kube-api-access-q5zrk") pod "72d8f24f-5e9f-480b-8f17-178d59ebc51d" (UID: "72d8f24f-5e9f-480b-8f17-178d59ebc51d"). 
InnerVolumeSpecName "kube-api-access-q5zrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.877955 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afd0a9f9-61a6-4380-8a98-2ddb06119202-kube-api-access-6fff7" (OuterVolumeSpecName: "kube-api-access-6fff7") pod "afd0a9f9-61a6-4380-8a98-2ddb06119202" (UID: "afd0a9f9-61a6-4380-8a98-2ddb06119202"). InnerVolumeSpecName "kube-api-access-6fff7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.900750 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "72d8f24f-5e9f-480b-8f17-178d59ebc51d" (UID: "72d8f24f-5e9f-480b-8f17-178d59ebc51d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.917158 4788 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.917186 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5zrk\" (UniqueName: \"kubernetes.io/projected/72d8f24f-5e9f-480b-8f17-178d59ebc51d-kube-api-access-q5zrk\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.917195 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fff7\" (UniqueName: \"kubernetes.io/projected/afd0a9f9-61a6-4380-8a98-2ddb06119202-kube-api-access-6fff7\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.924061 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72d8f24f-5e9f-480b-8f17-178d59ebc51d" (UID: "72d8f24f-5e9f-480b-8f17-178d59ebc51d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.936078 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "72d8f24f-5e9f-480b-8f17-178d59ebc51d" (UID: "72d8f24f-5e9f-480b-8f17-178d59ebc51d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.944707 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-config" (OuterVolumeSpecName: "config") pod "72d8f24f-5e9f-480b-8f17-178d59ebc51d" (UID: "72d8f24f-5e9f-480b-8f17-178d59ebc51d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.961018 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "72d8f24f-5e9f-480b-8f17-178d59ebc51d" (UID: "72d8f24f-5e9f-480b-8f17-178d59ebc51d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.962264 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "afd0a9f9-61a6-4380-8a98-2ddb06119202" (UID: "afd0a9f9-61a6-4380-8a98-2ddb06119202"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.963331 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "afd0a9f9-61a6-4380-8a98-2ddb06119202" (UID: "afd0a9f9-61a6-4380-8a98-2ddb06119202"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.965969 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-config" (OuterVolumeSpecName: "config") pod "afd0a9f9-61a6-4380-8a98-2ddb06119202" (UID: "afd0a9f9-61a6-4380-8a98-2ddb06119202"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.980852 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "afd0a9f9-61a6-4380-8a98-2ddb06119202" (UID: "afd0a9f9-61a6-4380-8a98-2ddb06119202"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:02:45 crc kubenswrapper[4788]: I0219 09:02:45.982079 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "afd0a9f9-61a6-4380-8a98-2ddb06119202" (UID: "afd0a9f9-61a6-4380-8a98-2ddb06119202"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.018312 4788 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.018346 4788 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.018358 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.018374 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.018386 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.018397 4788 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.018407 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.018418 4788 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/72d8f24f-5e9f-480b-8f17-178d59ebc51d-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.018425 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afd0a9f9-61a6-4380-8a98-2ddb06119202-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.119788 4788 scope.go:117] "RemoveContainer" containerID="8150d321c659059affed0871779c3db7c71e68fe07696885a063990924a72086" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.174312 4788 scope.go:117] "RemoveContainer" containerID="5c62216ae4db61c5ade08749fc3bc7572732577d80b0dcf9393ae1dcf8fef31c" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.223409 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.253043 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjn9l"] Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.280743 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-xjn9l"] Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.305313 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-79f86df99f-fntgk"] Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.331324 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-79f86df99f-fntgk"] Feb 19 09:02:46 crc kubenswrapper[4788]: E0219 09:02:46.480986 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d8f24f_5e9f_480b_8f17_178d59ebc51d.slice/crio-bba8c2f89f3776ac73fd386e96efe04fe37b1ab85a377fd382206020bb86797f\": RecentStats: unable to find data in 
memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd0a9f9_61a6_4380_8a98_2ddb06119202.slice/crio-c1e0af28bf2133207518f8e0f01c3953dd29e8bb3ec32a32975cd58c83cfb56a\": RecentStats: unable to find data in memory cache]" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.745566 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" path="/var/lib/kubelet/pods/72d8f24f-5e9f-480b-8f17-178d59ebc51d/volumes" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.746169 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afd0a9f9-61a6-4380-8a98-2ddb06119202" path="/var/lib/kubelet/pods/afd0a9f9-61a6-4380-8a98-2ddb06119202/volumes" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.872322 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647d45fc9-99jnj" event={"ID":"27632492-f51f-49c6-a63a-d037329d57e9","Type":"ContainerStarted","Data":"555e88832e22a725ab2a2e00ca1210f7871ce91b79c05ec4faa6f513c913a592"} Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.872655 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-647d45fc9-99jnj" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.883426 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" event={"ID":"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5","Type":"ContainerStarted","Data":"5ba02fdbf151ca682f814e7f04d5b209fc506de5dd22e628255c95246edb2ee2"} Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.883476 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" event={"ID":"0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5","Type":"ContainerStarted","Data":"e39cef2a9e846b2f507b90c76e8ef2578422c87de4d63b1bb70ed3d0497a92dc"} Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.893428 4788 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-worker-5565784c67-nzhww" event={"ID":"c0a88dc2-18b9-4d55-9930-0c3396063e8b","Type":"ContainerStarted","Data":"3d49af0867bcd142b1c9e17dfd428eabdc1d68b6511e165a3828fa90ae7ddd08"} Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.893473 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5565784c67-nzhww" event={"ID":"c0a88dc2-18b9-4d55-9930-0c3396063e8b","Type":"ContainerStarted","Data":"c627afb77283e53bb8925036690a6c042770dd285375be0325ab023e7efebb1a"} Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.915356 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-647d45fc9-99jnj" podStartSLOduration=5.915338322 podStartE2EDuration="5.915338322s" podCreationTimestamp="2026-02-19 09:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:46.899763858 +0000 UTC m=+1068.887775330" watchObservedRunningTime="2026-02-19 09:02:46.915338322 +0000 UTC m=+1068.903349794" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.928047 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5565784c67-nzhww" podStartSLOduration=3.623103645 podStartE2EDuration="7.928024596s" podCreationTimestamp="2026-02-19 09:02:39 +0000 UTC" firstStartedPulling="2026-02-19 09:02:40.789715798 +0000 UTC m=+1062.777727260" lastFinishedPulling="2026-02-19 09:02:45.094636739 +0000 UTC m=+1067.082648211" observedRunningTime="2026-02-19 09:02:46.927452592 +0000 UTC m=+1068.915464064" watchObservedRunningTime="2026-02-19 09:02:46.928024596 +0000 UTC m=+1068.916036068" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.932687 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" 
event={"ID":"379f970e-625d-401e-b625-a81a8e19ec02","Type":"ContainerStarted","Data":"502a0091c25fe8629946739276942cf44b25e6393de5a151cf8436937dd87611"} Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.933252 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.969767 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-78686fb9d-c7vd2" podStartSLOduration=3.853613751 podStartE2EDuration="7.969746958s" podCreationTimestamp="2026-02-19 09:02:39 +0000 UTC" firstStartedPulling="2026-02-19 09:02:40.944509875 +0000 UTC m=+1062.932521347" lastFinishedPulling="2026-02-19 09:02:45.060643082 +0000 UTC m=+1067.048654554" observedRunningTime="2026-02-19 09:02:46.951168352 +0000 UTC m=+1068.939179824" watchObservedRunningTime="2026-02-19 09:02:46.969746958 +0000 UTC m=+1068.957758430" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.981562 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e924ad6-b71b-400f-af06-406e9e2341e4","Type":"ContainerStarted","Data":"dbde9939c2b7c2c5f8b1a1d7b364a0ded140ebe391ea29a5be3ed919873994cd"} Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.981972 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9e924ad6-b71b-400f-af06-406e9e2341e4" containerName="cinder-api-log" containerID="cri-o://093381a8bb52834eac100e41a38499bde99aaad2811427c5fe6c9df7ab179ef4" gracePeriod=30 Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.982282 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9e924ad6-b71b-400f-af06-406e9e2341e4" containerName="cinder-api" containerID="cri-o://dbde9939c2b7c2c5f8b1a1d7b364a0ded140ebe391ea29a5be3ed919873994cd" gracePeriod=30 Feb 19 09:02:46 crc kubenswrapper[4788]: 
I0219 09:02:46.982302 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 09:02:46 crc kubenswrapper[4788]: I0219 09:02:46.995901 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"183a54b2-c97f-47de-93f6-024d497fffe6","Type":"ContainerStarted","Data":"d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459"} Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.022206 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" podStartSLOduration=6.022184678 podStartE2EDuration="6.022184678s" podCreationTimestamp="2026-02-19 09:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:47.011146312 +0000 UTC m=+1068.999157794" watchObservedRunningTime="2026-02-19 09:02:47.022184678 +0000 UTC m=+1069.010196150" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.060458 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.060434386 podStartE2EDuration="6.060434386s" podCreationTimestamp="2026-02-19 09:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:47.041212144 +0000 UTC m=+1069.029223626" watchObservedRunningTime="2026-02-19 09:02:47.060434386 +0000 UTC m=+1069.048445858" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.628506 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56db6bc974-wjvtx"] Feb 19 09:02:47 crc kubenswrapper[4788]: E0219 09:02:47.629237 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" containerName="neutron-api" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.629279 4788 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" containerName="neutron-api" Feb 19 09:02:47 crc kubenswrapper[4788]: E0219 09:02:47.629300 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" containerName="neutron-httpd" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.629306 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" containerName="neutron-httpd" Feb 19 09:02:47 crc kubenswrapper[4788]: E0219 09:02:47.629319 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd0a9f9-61a6-4380-8a98-2ddb06119202" containerName="init" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.629326 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd0a9f9-61a6-4380-8a98-2ddb06119202" containerName="init" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.629578 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd0a9f9-61a6-4380-8a98-2ddb06119202" containerName="init" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.629611 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" containerName="neutron-httpd" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.629627 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d8f24f-5e9f-480b-8f17-178d59ebc51d" containerName="neutron-api" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.630602 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.633842 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.644513 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.647363 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56db6bc974-wjvtx"] Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.714385 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-config-data\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.714493 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-internal-tls-certs\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.714572 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-public-tls-certs\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.714601 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-config-data-custom\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.714630 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-combined-ca-bundle\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.714684 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9czl6\" (UniqueName: \"kubernetes.io/projected/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-kube-api-access-9czl6\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.714726 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-logs\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.816178 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-config-data\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.816302 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-internal-tls-certs\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.816359 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-public-tls-certs\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.816382 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-config-data-custom\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.816409 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-combined-ca-bundle\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.816465 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9czl6\" (UniqueName: \"kubernetes.io/projected/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-kube-api-access-9czl6\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.816500 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-logs\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.816978 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-logs\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.822778 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-combined-ca-bundle\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.823565 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-config-data\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.825708 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-config-data-custom\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.825940 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-public-tls-certs\") pod 
\"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.826106 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-internal-tls-certs\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.841236 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9czl6\" (UniqueName: \"kubernetes.io/projected/817ea3c2-ff5a-475b-a6cd-295c84c9d02c-kube-api-access-9czl6\") pod \"barbican-api-56db6bc974-wjvtx\" (UID: \"817ea3c2-ff5a-475b-a6cd-295c84c9d02c\") " pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:47 crc kubenswrapper[4788]: I0219 09:02:47.949912 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:02:48 crc kubenswrapper[4788]: I0219 09:02:48.020363 4788 generic.go:334] "Generic (PLEG): container finished" podID="9e924ad6-b71b-400f-af06-406e9e2341e4" containerID="093381a8bb52834eac100e41a38499bde99aaad2811427c5fe6c9df7ab179ef4" exitCode=143 Feb 19 09:02:48 crc kubenswrapper[4788]: I0219 09:02:48.020467 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e924ad6-b71b-400f-af06-406e9e2341e4","Type":"ContainerDied","Data":"093381a8bb52834eac100e41a38499bde99aaad2811427c5fe6c9df7ab179ef4"} Feb 19 09:02:48 crc kubenswrapper[4788]: I0219 09:02:48.030916 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"183a54b2-c97f-47de-93f6-024d497fffe6","Type":"ContainerStarted","Data":"bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b"} Feb 19 09:02:48 crc kubenswrapper[4788]: I0219 09:02:48.544925 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.302367091 podStartE2EDuration="7.544901485s" podCreationTimestamp="2026-02-19 09:02:41 +0000 UTC" firstStartedPulling="2026-02-19 09:02:42.817670068 +0000 UTC m=+1064.805681570" lastFinishedPulling="2026-02-19 09:02:45.060204492 +0000 UTC m=+1067.048215964" observedRunningTime="2026-02-19 09:02:48.068620618 +0000 UTC m=+1070.056632090" watchObservedRunningTime="2026-02-19 09:02:48.544901485 +0000 UTC m=+1070.532912967" Feb 19 09:02:48 crc kubenswrapper[4788]: I0219 09:02:48.560582 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56db6bc974-wjvtx"] Feb 19 09:02:51 crc kubenswrapper[4788]: W0219 09:02:51.774650 4788 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod817ea3c2_ff5a_475b_a6cd_295c84c9d02c.slice/crio-1595ae0c603651f14ab1d424c88e65fe1c665d3989eea42ba3b44611f5388b28 WatchSource:0}: Error finding container 1595ae0c603651f14ab1d424c88e65fe1c665d3989eea42ba3b44611f5388b28: Status 404 returned error can't find the container with id 1595ae0c603651f14ab1d424c88e65fe1c665d3989eea42ba3b44611f5388b28
Feb 19 09:02:52 crc kubenswrapper[4788]: I0219 09:02:52.006193 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54b6649458-8rbkh"
Feb 19 09:02:52 crc kubenswrapper[4788]: I0219 09:02:52.069334 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54b6649458-8rbkh"
Feb 19 09:02:52 crc kubenswrapper[4788]: I0219 09:02:52.078701 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56db6bc974-wjvtx" event={"ID":"817ea3c2-ff5a-475b-a6cd-295c84c9d02c","Type":"ContainerStarted","Data":"1595ae0c603651f14ab1d424c88e65fe1c665d3989eea42ba3b44611f5388b28"}
Feb 19 09:02:52 crc kubenswrapper[4788]: I0219 09:02:52.184469 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 19 09:02:52 crc kubenswrapper[4788]: I0219 09:02:52.230058 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:02:52 crc kubenswrapper[4788]: I0219 09:02:52.298826 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tlwft"]
Feb 19 09:02:52 crc kubenswrapper[4788]: I0219 09:02:52.299118 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-tlwft" podUID="6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" containerName="dnsmasq-dns" containerID="cri-o://b819790621b6d390d42c9c13a55c404d84091c6c5b57d7c2249b0cada1303a2a" gracePeriod=10
Feb 19 09:02:52 crc kubenswrapper[4788]: I0219 09:02:52.469209 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 19 09:02:53 crc kubenswrapper[4788]: I0219 09:02:53.091076 4788 generic.go:334] "Generic (PLEG): container finished" podID="6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" containerID="b819790621b6d390d42c9c13a55c404d84091c6c5b57d7c2249b0cada1303a2a" exitCode=0
Feb 19 09:02:53 crc kubenswrapper[4788]: I0219 09:02:53.091164 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tlwft" event={"ID":"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5","Type":"ContainerDied","Data":"b819790621b6d390d42c9c13a55c404d84091c6c5b57d7c2249b0cada1303a2a"}
Feb 19 09:02:53 crc kubenswrapper[4788]: I0219 09:02:53.128626 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.109893 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="183a54b2-c97f-47de-93f6-024d497fffe6" containerName="cinder-scheduler" containerID="cri-o://d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459" gracePeriod=30
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.109926 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="183a54b2-c97f-47de-93f6-024d497fffe6" containerName="probe" containerID="cri-o://bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b" gracePeriod=30
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.432745 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tlwft"
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.581745 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-config\") pod \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") "
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.582258 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-swift-storage-0\") pod \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") "
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.582284 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-nb\") pod \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") "
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.582326 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n2p6\" (UniqueName: \"kubernetes.io/projected/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-kube-api-access-9n2p6\") pod \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") "
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.582361 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-sb\") pod \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") "
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.582427 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-svc\") pod \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\" (UID: \"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5\") "
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.592221 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-kube-api-access-9n2p6" (OuterVolumeSpecName: "kube-api-access-9n2p6") pod "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" (UID: "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5"). InnerVolumeSpecName "kube-api-access-9n2p6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.662581 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" (UID: "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.668594 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-config" (OuterVolumeSpecName: "config") pod "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" (UID: "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.669479 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" (UID: "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:02:54 crc kubenswrapper[4788]: E0219 09:02:54.677213 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b"
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.677765 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" (UID: "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.678889 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" (UID: "6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.684513 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-config\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.684649 4788 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.684724 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.684782 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n2p6\" (UniqueName: \"kubernetes.io/projected/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-kube-api-access-9n2p6\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.684837 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.684897 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:54 crc kubenswrapper[4788]: I0219 09:02:54.726565 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.121481 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"604aa3ed-40d6-437a-93f3-0e7a445b862b","Type":"ContainerStarted","Data":"9ab59849b5b10d6b19f87cbd76d43e47d88723285de01267062056e3b29b2860"}
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.122341 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="ceilometer-notification-agent" containerID="cri-o://eec4936a279633822f5eec8d3f3413a25656e7bb6fb479f1dbb584f10dd3cb00" gracePeriod=30
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.122517 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.122918 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="proxy-httpd" containerID="cri-o://9ab59849b5b10d6b19f87cbd76d43e47d88723285de01267062056e3b29b2860" gracePeriod=30
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.123058 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="sg-core" containerID="cri-o://2a5dbb1d57bfd2b196066fab31c23d95bb570bcff4081dec0df1c58832954eb8" gracePeriod=30
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.127833 4788 generic.go:334] "Generic (PLEG): container finished" podID="183a54b2-c97f-47de-93f6-024d497fffe6" containerID="bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b" exitCode=0
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.127963 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"183a54b2-c97f-47de-93f6-024d497fffe6","Type":"ContainerDied","Data":"bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b"}
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.130028 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-tlwft" event={"ID":"6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5","Type":"ContainerDied","Data":"5a902fa7009d3b217fa0e5e1e4a731510888bcabb4f31d46ca30d712bfa51526"}
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.130071 4788 scope.go:117] "RemoveContainer" containerID="b819790621b6d390d42c9c13a55c404d84091c6c5b57d7c2249b0cada1303a2a"
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.130281 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-tlwft"
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.132691 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56db6bc974-wjvtx" event={"ID":"817ea3c2-ff5a-475b-a6cd-295c84c9d02c","Type":"ContainerStarted","Data":"38766eff467896538c0f922b4cd108b43f7c83f2d5996ac1a65e80431b6d709a"}
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.132824 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56db6bc974-wjvtx" event={"ID":"817ea3c2-ff5a-475b-a6cd-295c84c9d02c","Type":"ContainerStarted","Data":"26ee76802de40148795d49587e7e7fe82ee998631d4320d3dd64f5a8cf67a7fa"}
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.132912 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56db6bc974-wjvtx"
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.132990 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56db6bc974-wjvtx"
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.155131 4788 scope.go:117] "RemoveContainer" containerID="b4f34c29986d5b69dea0638c001f1f96d6e7e3fcb4c32c043eae0572802c4112"
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.191845 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56db6bc974-wjvtx" podStartSLOduration=8.191827388 podStartE2EDuration="8.191827388s" podCreationTimestamp="2026-02-19 09:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:02:55.166203473 +0000 UTC m=+1077.154214955" watchObservedRunningTime="2026-02-19 09:02:55.191827388 +0000 UTC m=+1077.179838860"
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.192855 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tlwft"]
Feb 19 09:02:55 crc kubenswrapper[4788]: I0219 09:02:55.203784 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-tlwft"]
Feb 19 09:02:56 crc kubenswrapper[4788]: I0219 09:02:56.144919 4788 generic.go:334] "Generic (PLEG): container finished" podID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerID="9ab59849b5b10d6b19f87cbd76d43e47d88723285de01267062056e3b29b2860" exitCode=0
Feb 19 09:02:56 crc kubenswrapper[4788]: I0219 09:02:56.145268 4788 generic.go:334] "Generic (PLEG): container finished" podID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerID="2a5dbb1d57bfd2b196066fab31c23d95bb570bcff4081dec0df1c58832954eb8" exitCode=2
Feb 19 09:02:56 crc kubenswrapper[4788]: I0219 09:02:56.144991 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"604aa3ed-40d6-437a-93f3-0e7a445b862b","Type":"ContainerDied","Data":"9ab59849b5b10d6b19f87cbd76d43e47d88723285de01267062056e3b29b2860"}
Feb 19 09:02:56 crc kubenswrapper[4788]: I0219 09:02:56.145363 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"604aa3ed-40d6-437a-93f3-0e7a445b862b","Type":"ContainerDied","Data":"2a5dbb1d57bfd2b196066fab31c23d95bb570bcff4081dec0df1c58832954eb8"}
Feb 19 09:02:56 crc kubenswrapper[4788]: E0219 09:02:56.712328 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod604aa3ed_40d6_437a_93f3_0e7a445b862b.slice/crio-eec4936a279633822f5eec8d3f3413a25656e7bb6fb479f1dbb584f10dd3cb00.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 09:02:56 crc kubenswrapper[4788]: I0219 09:02:56.729095 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" path="/var/lib/kubelet/pods/6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5/volumes"
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.163004 4788 generic.go:334] "Generic (PLEG): container finished" podID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerID="eec4936a279633822f5eec8d3f3413a25656e7bb6fb479f1dbb584f10dd3cb00" exitCode=0
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.163066 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"604aa3ed-40d6-437a-93f3-0e7a445b862b","Type":"ContainerDied","Data":"eec4936a279633822f5eec8d3f3413a25656e7bb6fb479f1dbb584f10dd3cb00"}
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.551767 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.743791 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-scripts\") pod \"604aa3ed-40d6-437a-93f3-0e7a445b862b\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") "
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.744488 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-sg-core-conf-yaml\") pod \"604aa3ed-40d6-437a-93f3-0e7a445b862b\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") "
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.744633 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r69hc\" (UniqueName: \"kubernetes.io/projected/604aa3ed-40d6-437a-93f3-0e7a445b862b-kube-api-access-r69hc\") pod \"604aa3ed-40d6-437a-93f3-0e7a445b862b\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") "
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.744728 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-config-data\") pod \"604aa3ed-40d6-437a-93f3-0e7a445b862b\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") "
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.744801 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-run-httpd\") pod \"604aa3ed-40d6-437a-93f3-0e7a445b862b\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") "
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.744932 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-combined-ca-bundle\") pod \"604aa3ed-40d6-437a-93f3-0e7a445b862b\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") "
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.745006 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-log-httpd\") pod \"604aa3ed-40d6-437a-93f3-0e7a445b862b\" (UID: \"604aa3ed-40d6-437a-93f3-0e7a445b862b\") "
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.745113 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "604aa3ed-40d6-437a-93f3-0e7a445b862b" (UID: "604aa3ed-40d6-437a-93f3-0e7a445b862b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.745734 4788 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.746300 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "604aa3ed-40d6-437a-93f3-0e7a445b862b" (UID: "604aa3ed-40d6-437a-93f3-0e7a445b862b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.763111 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604aa3ed-40d6-437a-93f3-0e7a445b862b-kube-api-access-r69hc" (OuterVolumeSpecName: "kube-api-access-r69hc") pod "604aa3ed-40d6-437a-93f3-0e7a445b862b" (UID: "604aa3ed-40d6-437a-93f3-0e7a445b862b"). InnerVolumeSpecName "kube-api-access-r69hc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.773828 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-scripts" (OuterVolumeSpecName: "scripts") pod "604aa3ed-40d6-437a-93f3-0e7a445b862b" (UID: "604aa3ed-40d6-437a-93f3-0e7a445b862b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.782948 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "604aa3ed-40d6-437a-93f3-0e7a445b862b" (UID: "604aa3ed-40d6-437a-93f3-0e7a445b862b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.824291 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "604aa3ed-40d6-437a-93f3-0e7a445b862b" (UID: "604aa3ed-40d6-437a-93f3-0e7a445b862b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.839071 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-config-data" (OuterVolumeSpecName: "config-data") pod "604aa3ed-40d6-437a-93f3-0e7a445b862b" (UID: "604aa3ed-40d6-437a-93f3-0e7a445b862b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.847157 4788 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.847278 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r69hc\" (UniqueName: \"kubernetes.io/projected/604aa3ed-40d6-437a-93f3-0e7a445b862b-kube-api-access-r69hc\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.847299 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.847310 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.847321 4788 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/604aa3ed-40d6-437a-93f3-0e7a445b862b-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:57 crc kubenswrapper[4788]: I0219 09:02:57.847331 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604aa3ed-40d6-437a-93f3-0e7a445b862b-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.183153 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"604aa3ed-40d6-437a-93f3-0e7a445b862b","Type":"ContainerDied","Data":"4cf532b6079b581b5ba944a8188f27318753a4102ffc74eddfc37f713c6f5e1c"}
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.183193 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.183263 4788 scope.go:117] "RemoveContainer" containerID="9ab59849b5b10d6b19f87cbd76d43e47d88723285de01267062056e3b29b2860"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.307801 4788 scope.go:117] "RemoveContainer" containerID="2a5dbb1d57bfd2b196066fab31c23d95bb570bcff4081dec0df1c58832954eb8"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.369760 4788 scope.go:117] "RemoveContainer" containerID="eec4936a279633822f5eec8d3f3413a25656e7bb6fb479f1dbb584f10dd3cb00"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.379012 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.393309 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.402405 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:02:58 crc kubenswrapper[4788]: E0219 09:02:58.402878 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" containerName="dnsmasq-dns"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.402897 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" containerName="dnsmasq-dns"
Feb 19 09:02:58 crc kubenswrapper[4788]: E0219 09:02:58.402911 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="proxy-httpd"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.402920 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="proxy-httpd"
Feb 19 09:02:58 crc kubenswrapper[4788]: E0219 09:02:58.402940 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="ceilometer-notification-agent"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.402949 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="ceilometer-notification-agent"
Feb 19 09:02:58 crc kubenswrapper[4788]: E0219 09:02:58.402956 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" containerName="init"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.402962 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" containerName="init"
Feb 19 09:02:58 crc kubenswrapper[4788]: E0219 09:02:58.402978 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="sg-core"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.402983 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="sg-core"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.403171 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="sg-core"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.403194 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="proxy-httpd"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.403205 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c7bdb9f-4455-4be6-b8e5-d0fa7713a2a5" containerName="dnsmasq-dns"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.403215 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" containerName="ceilometer-notification-agent"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.405208 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.410939 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.412437 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.413461 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.463220 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.463304 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.463348 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-run-httpd\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.463455 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-config-data\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.463507 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-log-httpd\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.463545 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg5jf\" (UniqueName: \"kubernetes.io/projected/5e77349a-c322-4ed1-ba26-8d952f277ba2-kube-api-access-vg5jf\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.463606 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-scripts\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.566610 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.566732 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.566804 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-run-httpd\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.566863 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-config-data\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.566892 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-log-httpd\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.566926 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg5jf\" (UniqueName: \"kubernetes.io/projected/5e77349a-c322-4ed1-ba26-8d952f277ba2-kube-api-access-vg5jf\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.566973 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-scripts\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.571720 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-run-httpd\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.572371 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-log-httpd\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.572967 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.573448 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-config-data\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.574522 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-scripts\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.585836 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.600522 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg5jf\" (UniqueName: \"kubernetes.io/projected/5e77349a-c322-4ed1-ba26-8d952f277ba2-kube-api-access-vg5jf\") pod \"ceilometer-0\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " pod="openstack/ceilometer-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.635349 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.669993 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data-custom\") pod \"183a54b2-c97f-47de-93f6-024d497fffe6\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") "
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.670083 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data\") pod \"183a54b2-c97f-47de-93f6-024d497fffe6\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") "
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.670189 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-scripts\") pod \"183a54b2-c97f-47de-93f6-024d497fffe6\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") "
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.670211 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/183a54b2-c97f-47de-93f6-024d497fffe6-etc-machine-id\") pod \"183a54b2-c97f-47de-93f6-024d497fffe6\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") "
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.670458 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/183a54b2-c97f-47de-93f6-024d497fffe6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "183a54b2-c97f-47de-93f6-024d497fffe6" (UID: "183a54b2-c97f-47de-93f6-024d497fffe6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.670478 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkv7t\" (UniqueName: \"kubernetes.io/projected/183a54b2-c97f-47de-93f6-024d497fffe6-kube-api-access-qkv7t\") pod \"183a54b2-c97f-47de-93f6-024d497fffe6\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") "
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.670576 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-combined-ca-bundle\") pod \"183a54b2-c97f-47de-93f6-024d497fffe6\" (UID: \"183a54b2-c97f-47de-93f6-024d497fffe6\") "
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.671263 4788 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/183a54b2-c97f-47de-93f6-024d497fffe6-etc-machine-id\") on node \"crc\" DevicePath \"\""
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.673573 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "183a54b2-c97f-47de-93f6-024d497fffe6" (UID: "183a54b2-c97f-47de-93f6-024d497fffe6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.682107 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183a54b2-c97f-47de-93f6-024d497fffe6-kube-api-access-qkv7t" (OuterVolumeSpecName: "kube-api-access-qkv7t") pod "183a54b2-c97f-47de-93f6-024d497fffe6" (UID: "183a54b2-c97f-47de-93f6-024d497fffe6"). InnerVolumeSpecName "kube-api-access-qkv7t".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.685045 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-scripts" (OuterVolumeSpecName: "scripts") pod "183a54b2-c97f-47de-93f6-024d497fffe6" (UID: "183a54b2-c97f-47de-93f6-024d497fffe6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.726808 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604aa3ed-40d6-437a-93f3-0e7a445b862b" path="/var/lib/kubelet/pods/604aa3ed-40d6-437a-93f3-0e7a445b862b/volumes" Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.749365 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.757527 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "183a54b2-c97f-47de-93f6-024d497fffe6" (UID: "183a54b2-c97f-47de-93f6-024d497fffe6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.773129 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.773764 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkv7t\" (UniqueName: \"kubernetes.io/projected/183a54b2-c97f-47de-93f6-024d497fffe6-kube-api-access-qkv7t\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.773852 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.773957 4788 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.810387 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data" (OuterVolumeSpecName: "config-data") pod "183a54b2-c97f-47de-93f6-024d497fffe6" (UID: "183a54b2-c97f-47de-93f6-024d497fffe6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:02:58 crc kubenswrapper[4788]: I0219 09:02:58.875808 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/183a54b2-c97f-47de-93f6-024d497fffe6-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.195379 4788 generic.go:334] "Generic (PLEG): container finished" podID="183a54b2-c97f-47de-93f6-024d497fffe6" containerID="d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459" exitCode=0 Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.195426 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"183a54b2-c97f-47de-93f6-024d497fffe6","Type":"ContainerDied","Data":"d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459"} Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.195493 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"183a54b2-c97f-47de-93f6-024d497fffe6","Type":"ContainerDied","Data":"a725a4a9bb14f4a444b347127fe0dbd274895111573766892463d0f9fef6cef0"} Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.195509 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.195517 4788 scope.go:117] "RemoveContainer" containerID="bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.221492 4788 scope.go:117] "RemoveContainer" containerID="d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.243327 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.258566 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.281331 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.281421 4788 scope.go:117] "RemoveContainer" containerID="bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b" Feb 19 09:02:59 crc kubenswrapper[4788]: E0219 09:02:59.281963 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b\": container with ID starting with bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b not found: ID does not exist" containerID="bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.282009 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b"} err="failed to get container status \"bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b\": rpc error: code = NotFound desc = could not find container \"bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b\": container with 
ID starting with bf00fffc8210be3910f9cda17eb98176c0d282481eac24a7faeae5c17c3a5b0b not found: ID does not exist" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.282032 4788 scope.go:117] "RemoveContainer" containerID="d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459" Feb 19 09:02:59 crc kubenswrapper[4788]: E0219 09:02:59.283894 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459\": container with ID starting with d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459 not found: ID does not exist" containerID="d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.283922 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459"} err="failed to get container status \"d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459\": rpc error: code = NotFound desc = could not find container \"d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459\": container with ID starting with d648d3f5e76ef0d4d14f4edc6a20b25ba9556a21e385d2ab3adf88cfc07ad459 not found: ID does not exist" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.295348 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 09:02:59 crc kubenswrapper[4788]: E0219 09:02:59.295822 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183a54b2-c97f-47de-93f6-024d497fffe6" containerName="probe" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.295845 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="183a54b2-c97f-47de-93f6-024d497fffe6" containerName="probe" Feb 19 09:02:59 crc kubenswrapper[4788]: E0219 09:02:59.295887 4788 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="183a54b2-c97f-47de-93f6-024d497fffe6" containerName="cinder-scheduler" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.295897 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="183a54b2-c97f-47de-93f6-024d497fffe6" containerName="cinder-scheduler" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.296116 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="183a54b2-c97f-47de-93f6-024d497fffe6" containerName="probe" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.296150 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="183a54b2-c97f-47de-93f6-024d497fffe6" containerName="cinder-scheduler" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.297269 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.299667 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.306799 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.384416 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzczs\" (UniqueName: \"kubernetes.io/projected/fe790dae-c901-4447-9716-8b3e366c08a0-kube-api-access-hzczs\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.384497 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-scripts\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: 
I0219 09:02:59.384529 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.384554 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-config-data\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.384624 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.384663 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe790dae-c901-4447-9716-8b3e366c08a0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.486662 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-scripts\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.486718 4788 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.486746 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-config-data\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.486819 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.486864 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe790dae-c901-4447-9716-8b3e366c08a0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.486884 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzczs\" (UniqueName: \"kubernetes.io/projected/fe790dae-c901-4447-9716-8b3e366c08a0-kube-api-access-hzczs\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.487730 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fe790dae-c901-4447-9716-8b3e366c08a0-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.492626 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-scripts\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.492652 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.493227 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-config-data\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.493808 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe790dae-c901-4447-9716-8b3e366c08a0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.520566 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzczs\" (UniqueName: \"kubernetes.io/projected/fe790dae-c901-4447-9716-8b3e366c08a0-kube-api-access-hzczs\") pod \"cinder-scheduler-0\" (UID: \"fe790dae-c901-4447-9716-8b3e366c08a0\") " pod="openstack/cinder-scheduler-0" Feb 19 09:02:59 crc kubenswrapper[4788]: I0219 09:02:59.643153 4788 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 19 09:03:00 crc kubenswrapper[4788]: I0219 09:03:00.086782 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 19 09:03:00 crc kubenswrapper[4788]: I0219 09:03:00.207702 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e77349a-c322-4ed1-ba26-8d952f277ba2","Type":"ContainerStarted","Data":"450c715c81711600c15286192e3d13329d9828d9ee8bc132208de3721be34c4c"} Feb 19 09:03:00 crc kubenswrapper[4788]: I0219 09:03:00.210523 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fe790dae-c901-4447-9716-8b3e366c08a0","Type":"ContainerStarted","Data":"575207d6a305d7878b4da0d93241efd82131bef28b146009edfd578b0144d51f"} Feb 19 09:03:00 crc kubenswrapper[4788]: I0219 09:03:00.726281 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183a54b2-c97f-47de-93f6-024d497fffe6" path="/var/lib/kubelet/pods/183a54b2-c97f-47de-93f6-024d497fffe6/volumes" Feb 19 09:03:01 crc kubenswrapper[4788]: I0219 09:03:01.230079 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fe790dae-c901-4447-9716-8b3e366c08a0","Type":"ContainerStarted","Data":"d014b04a49ce97720905ab62229a48fc103d8ce5e775039ec10b584cc01c3b83"} Feb 19 09:03:01 crc kubenswrapper[4788]: I0219 09:03:01.236042 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e77349a-c322-4ed1-ba26-8d952f277ba2","Type":"ContainerStarted","Data":"07b2cdcbf1b7746016e60e8636cf24f0de6f2eef452e01edf5f6672364a87a54"} Feb 19 09:03:02 crc kubenswrapper[4788]: I0219 09:03:02.246880 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5e77349a-c322-4ed1-ba26-8d952f277ba2","Type":"ContainerStarted","Data":"38d69d6e4ae8079f6c7fb2b2018bd736e0b05f2d95d8e80427ad9d01640c00c3"} Feb 19 09:03:02 crc kubenswrapper[4788]: I0219 09:03:02.247266 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e77349a-c322-4ed1-ba26-8d952f277ba2","Type":"ContainerStarted","Data":"68444d649b15354e7a4db1446cf27e0aeecb3d6ebf05c580f1c04ac72da27060"} Feb 19 09:03:02 crc kubenswrapper[4788]: I0219 09:03:02.248569 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fe790dae-c901-4447-9716-8b3e366c08a0","Type":"ContainerStarted","Data":"070e00ff591004568b409a2c19393ff40173dddd20a7f5c340b5bfea710cf28e"} Feb 19 09:03:02 crc kubenswrapper[4788]: I0219 09:03:02.264635 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.264620198 podStartE2EDuration="3.264620198s" podCreationTimestamp="2026-02-19 09:02:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:02.264059105 +0000 UTC m=+1084.252070587" watchObservedRunningTime="2026-02-19 09:03:02.264620198 +0000 UTC m=+1084.252631660" Feb 19 09:03:04 crc kubenswrapper[4788]: I0219 09:03:04.372397 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-748f6c7c59-q59qb" Feb 19 09:03:04 crc kubenswrapper[4788]: I0219 09:03:04.440030 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:03:04 crc kubenswrapper[4788]: I0219 09:03:04.644119 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 19 09:03:04 crc kubenswrapper[4788]: I0219 09:03:04.757657 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-56db6bc974-wjvtx" Feb 19 09:03:04 crc kubenswrapper[4788]: I0219 09:03:04.827080 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54b6649458-8rbkh"] Feb 19 09:03:04 crc kubenswrapper[4788]: I0219 09:03:04.827318 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54b6649458-8rbkh" podUID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerName="barbican-api-log" containerID="cri-o://e03f6721cae8cdf981bb9755146b82c7557357151ac5e734d6dd218b83ffe87e" gracePeriod=30 Feb 19 09:03:04 crc kubenswrapper[4788]: I0219 09:03:04.827692 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-54b6649458-8rbkh" podUID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerName="barbican-api" containerID="cri-o://4ad1b7c830e57d5b6f53c0a0cac3edca8ef3c10808e0aaa87a07eb117c6aa64f" gracePeriod=30 Feb 19 09:03:05 crc kubenswrapper[4788]: I0219 09:03:05.288091 4788 generic.go:334] "Generic (PLEG): container finished" podID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerID="e03f6721cae8cdf981bb9755146b82c7557357151ac5e734d6dd218b83ffe87e" exitCode=143 Feb 19 09:03:05 crc kubenswrapper[4788]: I0219 09:03:05.288350 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b6649458-8rbkh" event={"ID":"f9e7e6e3-ef8d-4863-a024-61596fa46d51","Type":"ContainerDied","Data":"e03f6721cae8cdf981bb9755146b82c7557357151ac5e734d6dd218b83ffe87e"} Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.307479 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e77349a-c322-4ed1-ba26-8d952f277ba2","Type":"ContainerStarted","Data":"15b4aa9bba267916367e901ef1b5ea45c574c8435626778143f28416523c9cc1"} Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.309478 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 09:03:06 crc 
kubenswrapper[4788]: I0219 09:03:06.345862 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.690431596 podStartE2EDuration="8.345830217s" podCreationTimestamp="2026-02-19 09:02:58 +0000 UTC" firstStartedPulling="2026-02-19 09:02:59.281397968 +0000 UTC m=+1081.269409440" lastFinishedPulling="2026-02-19 09:03:04.936796589 +0000 UTC m=+1086.924808061" observedRunningTime="2026-02-19 09:03:06.329107386 +0000 UTC m=+1088.317118888" watchObservedRunningTime="2026-02-19 09:03:06.345830217 +0000 UTC m=+1088.333841699" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.702735 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.704053 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.707370 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.707571 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.709668 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-gwrsz" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.730413 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.842519 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:06 crc 
kubenswrapper[4788]: I0219 09:03:06.842775 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config-secret\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.842876 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpvrj\" (UniqueName: \"kubernetes.io/projected/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-kube-api-access-dpvrj\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.842910 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.986802 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.987115 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config-secret\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.987291 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dpvrj\" (UniqueName: \"kubernetes.io/projected/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-kube-api-access-dpvrj\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.987419 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.988268 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.994754 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config-secret\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:06 crc kubenswrapper[4788]: I0219 09:03:06.998336 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.023229 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpvrj\" (UniqueName: 
\"kubernetes.io/projected/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-kube-api-access-dpvrj\") pod \"openstackclient\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.027172 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.039325 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.050335 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.086181 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.087496 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.088832 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dd204842-7289-418d-a4d1-e0d079e368b3-openstack-config-secret\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.088890 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd204842-7289-418d-a4d1-e0d079e368b3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.088945 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc2bp\" (UniqueName: 
\"kubernetes.io/projected/dd204842-7289-418d-a4d1-e0d079e368b3-kube-api-access-xc2bp\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.088992 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dd204842-7289-418d-a4d1-e0d079e368b3-openstack-config\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.100156 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 09:03:07 crc kubenswrapper[4788]: E0219 09:03:07.178173 4788 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 19 09:03:07 crc kubenswrapper[4788]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca_0(f0207393c175f7fd4fd32206ea9025191f132c6205462078708caabcafafb7c8): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f0207393c175f7fd4fd32206ea9025191f132c6205462078708caabcafafb7c8" Netns:"/var/run/netns/31f3b39e-28d9-408a-846c-a169c4d07267" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f0207393c175f7fd4fd32206ea9025191f132c6205462078708caabcafafb7c8;K8S_POD_UID=ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca]: expected pod UID "ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca" but got "dd204842-7289-418d-a4d1-e0d079e368b3" from Kube API Feb 19 09:03:07 crc kubenswrapper[4788]: ': 
StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 09:03:07 crc kubenswrapper[4788]: > Feb 19 09:03:07 crc kubenswrapper[4788]: E0219 09:03:07.178264 4788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 19 09:03:07 crc kubenswrapper[4788]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca_0(f0207393c175f7fd4fd32206ea9025191f132c6205462078708caabcafafb7c8): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f0207393c175f7fd4fd32206ea9025191f132c6205462078708caabcafafb7c8" Netns:"/var/run/netns/31f3b39e-28d9-408a-846c-a169c4d07267" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=f0207393c175f7fd4fd32206ea9025191f132c6205462078708caabcafafb7c8;K8S_POD_UID=ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca]: expected pod UID "ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca" but got "dd204842-7289-418d-a4d1-e0d079e368b3" from Kube API Feb 19 09:03:07 crc kubenswrapper[4788]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 19 09:03:07 crc kubenswrapper[4788]: > pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.191092 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dd204842-7289-418d-a4d1-e0d079e368b3-openstack-config\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.191213 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dd204842-7289-418d-a4d1-e0d079e368b3-openstack-config-secret\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.191268 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd204842-7289-418d-a4d1-e0d079e368b3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.191346 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc2bp\" (UniqueName: \"kubernetes.io/projected/dd204842-7289-418d-a4d1-e0d079e368b3-kube-api-access-xc2bp\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.192539 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dd204842-7289-418d-a4d1-e0d079e368b3-openstack-config\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.196294 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dd204842-7289-418d-a4d1-e0d079e368b3-openstack-config-secret\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.202527 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd204842-7289-418d-a4d1-e0d079e368b3-combined-ca-bundle\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.211128 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc2bp\" (UniqueName: \"kubernetes.io/projected/dd204842-7289-418d-a4d1-e0d079e368b3-kube-api-access-xc2bp\") pod \"openstackclient\" (UID: \"dd204842-7289-418d-a4d1-e0d079e368b3\") " pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.315553 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.319343 4788 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca" podUID="dd204842-7289-418d-a4d1-e0d079e368b3" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.327643 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.395580 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpvrj\" (UniqueName: \"kubernetes.io/projected/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-kube-api-access-dpvrj\") pod \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.395650 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-combined-ca-bundle\") pod \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.399515 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-kube-api-access-dpvrj" (OuterVolumeSpecName: "kube-api-access-dpvrj") pod "ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca" (UID: "ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca"). InnerVolumeSpecName "kube-api-access-dpvrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.399918 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca" (UID: "ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.451390 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.499806 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config-secret\") pod \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.499945 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config\") pod \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\" (UID: \"ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca\") " Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.500720 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca" (UID: "ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.503034 4788 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.503067 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpvrj\" (UniqueName: \"kubernetes.io/projected/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-kube-api-access-dpvrj\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.503083 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.505868 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca" (UID: "ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.604328 4788 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:07 crc kubenswrapper[4788]: I0219 09:03:07.929598 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.019117 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54b6649458-8rbkh" podUID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:49970->10.217.0.160:9311: read: connection reset by peer" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.019132 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-54b6649458-8rbkh" podUID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:49966->10.217.0.160:9311: read: connection reset by peer" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.331707 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"dd204842-7289-418d-a4d1-e0d079e368b3","Type":"ContainerStarted","Data":"be4699f6b8807d8c837f4601177dffb605f3044ea8935c01fe3bd7fe26274dcd"} Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.345030 4788 generic.go:334] "Generic (PLEG): container finished" podID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerID="4ad1b7c830e57d5b6f53c0a0cac3edca8ef3c10808e0aaa87a07eb117c6aa64f" exitCode=0 Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.345103 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.345106 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b6649458-8rbkh" event={"ID":"f9e7e6e3-ef8d-4863-a024-61596fa46d51","Type":"ContainerDied","Data":"4ad1b7c830e57d5b6f53c0a0cac3edca8ef3c10808e0aaa87a07eb117c6aa64f"} Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.347826 4788 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca" podUID="dd204842-7289-418d-a4d1-e0d079e368b3" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.638024 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.721729 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvm4q\" (UniqueName: \"kubernetes.io/projected/f9e7e6e3-ef8d-4863-a024-61596fa46d51-kube-api-access-mvm4q\") pod \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.722198 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-combined-ca-bundle\") pod \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.722326 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data-custom\") pod \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.722371 4788 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9e7e6e3-ef8d-4863-a024-61596fa46d51-logs\") pod \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.722427 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data\") pod \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\" (UID: \"f9e7e6e3-ef8d-4863-a024-61596fa46d51\") " Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.722832 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e7e6e3-ef8d-4863-a024-61596fa46d51-logs" (OuterVolumeSpecName: "logs") pod "f9e7e6e3-ef8d-4863-a024-61596fa46d51" (UID: "f9e7e6e3-ef8d-4863-a024-61596fa46d51"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.722994 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9e7e6e3-ef8d-4863-a024-61596fa46d51-logs\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.738854 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca" path="/var/lib/kubelet/pods/ae7e89cf-9ee2-43d3-befc-41e9f9c6d0ca/volumes" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.740445 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f9e7e6e3-ef8d-4863-a024-61596fa46d51" (UID: "f9e7e6e3-ef8d-4863-a024-61596fa46d51"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.747418 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e7e6e3-ef8d-4863-a024-61596fa46d51-kube-api-access-mvm4q" (OuterVolumeSpecName: "kube-api-access-mvm4q") pod "f9e7e6e3-ef8d-4863-a024-61596fa46d51" (UID: "f9e7e6e3-ef8d-4863-a024-61596fa46d51"). InnerVolumeSpecName "kube-api-access-mvm4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.809670 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9e7e6e3-ef8d-4863-a024-61596fa46d51" (UID: "f9e7e6e3-ef8d-4863-a024-61596fa46d51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.826061 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.826089 4788 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.826098 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvm4q\" (UniqueName: \"kubernetes.io/projected/f9e7e6e3-ef8d-4863-a024-61596fa46d51-kube-api-access-mvm4q\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:08 crc kubenswrapper[4788]: I0219 09:03:08.939818 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data" (OuterVolumeSpecName: "config-data") pod "f9e7e6e3-ef8d-4863-a024-61596fa46d51" (UID: "f9e7e6e3-ef8d-4863-a024-61596fa46d51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:09 crc kubenswrapper[4788]: I0219 09:03:09.028856 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9e7e6e3-ef8d-4863-a024-61596fa46d51-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:09 crc kubenswrapper[4788]: I0219 09:03:09.358821 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54b6649458-8rbkh" event={"ID":"f9e7e6e3-ef8d-4863-a024-61596fa46d51","Type":"ContainerDied","Data":"08ea41ff5e49273b9e1172295e2d60cb625da8effe1941616cbdf94c3cffc4ae"} Feb 19 09:03:09 crc kubenswrapper[4788]: I0219 09:03:09.358885 4788 scope.go:117] "RemoveContainer" containerID="4ad1b7c830e57d5b6f53c0a0cac3edca8ef3c10808e0aaa87a07eb117c6aa64f" Feb 19 09:03:09 crc kubenswrapper[4788]: I0219 09:03:09.358893 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54b6649458-8rbkh" Feb 19 09:03:09 crc kubenswrapper[4788]: I0219 09:03:09.400751 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-54b6649458-8rbkh"] Feb 19 09:03:09 crc kubenswrapper[4788]: I0219 09:03:09.403965 4788 scope.go:117] "RemoveContainer" containerID="e03f6721cae8cdf981bb9755146b82c7557357151ac5e734d6dd218b83ffe87e" Feb 19 09:03:09 crc kubenswrapper[4788]: I0219 09:03:09.408143 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-54b6649458-8rbkh"] Feb 19 09:03:09 crc kubenswrapper[4788]: I0219 09:03:09.852824 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 19 09:03:10 crc kubenswrapper[4788]: I0219 09:03:10.739344 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" path="/var/lib/kubelet/pods/f9e7e6e3-ef8d-4863-a024-61596fa46d51/volumes" Feb 19 09:03:11 crc kubenswrapper[4788]: I0219 09:03:11.335736 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:03:11 crc kubenswrapper[4788]: I0219 09:03:11.372442 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-86464574f6-lv4mn" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.034206 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-58576d599f-zjnz8"] Feb 19 09:03:12 crc kubenswrapper[4788]: E0219 09:03:12.034717 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerName="barbican-api" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.034734 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerName="barbican-api" Feb 19 09:03:12 crc kubenswrapper[4788]: E0219 09:03:12.034750 4788 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerName="barbican-api-log" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.034757 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerName="barbican-api-log" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.034968 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerName="barbican-api" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.034996 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e7e6e3-ef8d-4863-a024-61596fa46d51" containerName="barbican-api-log" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.036111 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.038476 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.039917 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.043386 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.047602 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-647d45fc9-99jnj" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.056945 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-58576d599f-zjnz8"] Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.157931 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c6b98d796-z866s"] Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.158374 4788 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c6b98d796-z866s" podUID="b4c17192-39ae-4520-aed0-f6325410b6b6" containerName="neutron-api" containerID="cri-o://5286348298ae9e5284ff477c31c3d706f166c1dcd16b4e678a3f30e44509763c" gracePeriod=30 Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.158862 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c6b98d796-z866s" podUID="b4c17192-39ae-4520-aed0-f6325410b6b6" containerName="neutron-httpd" containerID="cri-o://2644ab20c671b994a5162da9472b9b4f0b23642d7e291858db5645285635078c" gracePeriod=30 Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.201464 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-internal-tls-certs\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.201570 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-combined-ca-bundle\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.201601 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6afa4604-515a-4774-9d5f-e641cb256988-run-httpd\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.201664 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-public-tls-certs\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.201737 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-config-data\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.202541 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6afa4604-515a-4774-9d5f-e641cb256988-etc-swift\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.202695 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6afa4604-515a-4774-9d5f-e641cb256988-log-httpd\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.202727 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kscrd\" (UniqueName: \"kubernetes.io/projected/6afa4604-515a-4774-9d5f-e641cb256988-kube-api-access-kscrd\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.306275 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6afa4604-515a-4774-9d5f-e641cb256988-log-httpd\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.306320 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kscrd\" (UniqueName: \"kubernetes.io/projected/6afa4604-515a-4774-9d5f-e641cb256988-kube-api-access-kscrd\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.306368 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-internal-tls-certs\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.306416 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-combined-ca-bundle\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.306432 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6afa4604-515a-4774-9d5f-e641cb256988-run-httpd\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.306461 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-public-tls-certs\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.306492 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-config-data\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.306509 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6afa4604-515a-4774-9d5f-e641cb256988-etc-swift\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.307139 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6afa4604-515a-4774-9d5f-e641cb256988-log-httpd\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.307611 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6afa4604-515a-4774-9d5f-e641cb256988-run-httpd\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.316323 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-public-tls-certs\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.326298 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-config-data\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.326981 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kscrd\" (UniqueName: \"kubernetes.io/projected/6afa4604-515a-4774-9d5f-e641cb256988-kube-api-access-kscrd\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.330660 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6afa4604-515a-4774-9d5f-e641cb256988-etc-swift\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.335839 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-combined-ca-bundle\") pod \"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.340134 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6afa4604-515a-4774-9d5f-e641cb256988-internal-tls-certs\") pod 
\"swift-proxy-58576d599f-zjnz8\" (UID: \"6afa4604-515a-4774-9d5f-e641cb256988\") " pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.363656 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.399375 4788 generic.go:334] "Generic (PLEG): container finished" podID="b4c17192-39ae-4520-aed0-f6325410b6b6" containerID="2644ab20c671b994a5162da9472b9b4f0b23642d7e291858db5645285635078c" exitCode=0 Feb 19 09:03:12 crc kubenswrapper[4788]: I0219 09:03:12.399418 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b98d796-z866s" event={"ID":"b4c17192-39ae-4520-aed0-f6325410b6b6","Type":"ContainerDied","Data":"2644ab20c671b994a5162da9472b9b4f0b23642d7e291858db5645285635078c"} Feb 19 09:03:13 crc kubenswrapper[4788]: I0219 09:03:13.034046 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-58576d599f-zjnz8"] Feb 19 09:03:13 crc kubenswrapper[4788]: I0219 09:03:13.416983 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58576d599f-zjnz8" event={"ID":"6afa4604-515a-4774-9d5f-e641cb256988","Type":"ContainerStarted","Data":"baf6cc6a3186c5c8e3ff56241a475ffb387c2ae5f28bbc744466ecf90ea510ff"} Feb 19 09:03:14 crc kubenswrapper[4788]: I0219 09:03:14.671475 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:03:14 crc kubenswrapper[4788]: I0219 09:03:14.672140 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="ceilometer-central-agent" containerID="cri-o://07b2cdcbf1b7746016e60e8636cf24f0de6f2eef452e01edf5f6672364a87a54" gracePeriod=30 Feb 19 09:03:14 crc kubenswrapper[4788]: I0219 09:03:14.672155 4788 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="proxy-httpd" containerID="cri-o://15b4aa9bba267916367e901ef1b5ea45c574c8435626778143f28416523c9cc1" gracePeriod=30 Feb 19 09:03:14 crc kubenswrapper[4788]: I0219 09:03:14.672310 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="sg-core" containerID="cri-o://38d69d6e4ae8079f6c7fb2b2018bd736e0b05f2d95d8e80427ad9d01640c00c3" gracePeriod=30 Feb 19 09:03:14 crc kubenswrapper[4788]: I0219 09:03:14.672368 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="ceilometer-notification-agent" containerID="cri-o://68444d649b15354e7a4db1446cf27e0aeecb3d6ebf05c580f1c04ac72da27060" gracePeriod=30 Feb 19 09:03:15 crc kubenswrapper[4788]: I0219 09:03:15.447073 4788 generic.go:334] "Generic (PLEG): container finished" podID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerID="15b4aa9bba267916367e901ef1b5ea45c574c8435626778143f28416523c9cc1" exitCode=0 Feb 19 09:03:15 crc kubenswrapper[4788]: I0219 09:03:15.447104 4788 generic.go:334] "Generic (PLEG): container finished" podID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerID="38d69d6e4ae8079f6c7fb2b2018bd736e0b05f2d95d8e80427ad9d01640c00c3" exitCode=2 Feb 19 09:03:15 crc kubenswrapper[4788]: I0219 09:03:15.447111 4788 generic.go:334] "Generic (PLEG): container finished" podID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerID="07b2cdcbf1b7746016e60e8636cf24f0de6f2eef452e01edf5f6672364a87a54" exitCode=0 Feb 19 09:03:15 crc kubenswrapper[4788]: I0219 09:03:15.447128 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e77349a-c322-4ed1-ba26-8d952f277ba2","Type":"ContainerDied","Data":"15b4aa9bba267916367e901ef1b5ea45c574c8435626778143f28416523c9cc1"} Feb 19 09:03:15 crc 
kubenswrapper[4788]: I0219 09:03:15.447152 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e77349a-c322-4ed1-ba26-8d952f277ba2","Type":"ContainerDied","Data":"38d69d6e4ae8079f6c7fb2b2018bd736e0b05f2d95d8e80427ad9d01640c00c3"} Feb 19 09:03:15 crc kubenswrapper[4788]: I0219 09:03:15.447161 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e77349a-c322-4ed1-ba26-8d952f277ba2","Type":"ContainerDied","Data":"07b2cdcbf1b7746016e60e8636cf24f0de6f2eef452e01edf5f6672364a87a54"} Feb 19 09:03:17 crc kubenswrapper[4788]: W0219 09:03:17.131508 4788 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e77349a_c322_4ed1_ba26_8d952f277ba2.slice/crio-15b4aa9bba267916367e901ef1b5ea45c574c8435626778143f28416523c9cc1.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e77349a_c322_4ed1_ba26_8d952f277ba2.slice/crio-15b4aa9bba267916367e901ef1b5ea45c574c8435626778143f28416523c9cc1.scope: no such file or directory Feb 19 09:03:17 crc kubenswrapper[4788]: W0219 09:03:17.131852 4788 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae7e89cf_9ee2_43d3_befc_41e9f9c6d0ca.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae7e89cf_9ee2_43d3_befc_41e9f9c6d0ca.slice: no such file or directory Feb 19 09:03:17 crc kubenswrapper[4788]: I0219 09:03:17.296852 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="9e924ad6-b71b-400f-af06-406e9e2341e4" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.164:8776/healthcheck\": dial tcp 10.217.0.164:8776: connect: connection refused" Feb 19 09:03:17 crc kubenswrapper[4788]: 
I0219 09:03:17.468737 4788 generic.go:334] "Generic (PLEG): container finished" podID="9e924ad6-b71b-400f-af06-406e9e2341e4" containerID="dbde9939c2b7c2c5f8b1a1d7b364a0ded140ebe391ea29a5be3ed919873994cd" exitCode=137 Feb 19 09:03:17 crc kubenswrapper[4788]: I0219 09:03:17.468790 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e924ad6-b71b-400f-af06-406e9e2341e4","Type":"ContainerDied","Data":"dbde9939c2b7c2c5f8b1a1d7b364a0ded140ebe391ea29a5be3ed919873994cd"} Feb 19 09:03:18 crc kubenswrapper[4788]: E0219 09:03:18.008917 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e77349a_c322_4ed1_ba26_8d952f277ba2.slice/crio-conmon-07b2cdcbf1b7746016e60e8636cf24f0de6f2eef452e01edf5f6672364a87a54.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4c17192_39ae_4520_aed0_f6325410b6b6.slice/crio-2644ab20c671b994a5162da9472b9b4f0b23642d7e291858db5645285635078c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4c17192_39ae_4520_aed0_f6325410b6b6.slice/crio-conmon-5286348298ae9e5284ff477c31c3d706f166c1dcd16b4e678a3f30e44509763c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e7e6e3_ef8d_4863_a024_61596fa46d51.slice/crio-4ad1b7c830e57d5b6f53c0a0cac3edca8ef3c10808e0aaa87a07eb117c6aa64f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e77349a_c322_4ed1_ba26_8d952f277ba2.slice/crio-07b2cdcbf1b7746016e60e8636cf24f0de6f2eef452e01edf5f6672364a87a54.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e924ad6_b71b_400f_af06_406e9e2341e4.slice/crio-dbde9939c2b7c2c5f8b1a1d7b364a0ded140ebe391ea29a5be3ed919873994cd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e77349a_c322_4ed1_ba26_8d952f277ba2.slice/crio-38d69d6e4ae8079f6c7fb2b2018bd736e0b05f2d95d8e80427ad9d01640c00c3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e77349a_c322_4ed1_ba26_8d952f277ba2.slice/crio-conmon-38d69d6e4ae8079f6c7fb2b2018bd736e0b05f2d95d8e80427ad9d01640c00c3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e7e6e3_ef8d_4863_a024_61596fa46d51.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e77349a_c322_4ed1_ba26_8d952f277ba2.slice/crio-conmon-68444d649b15354e7a4db1446cf27e0aeecb3d6ebf05c580f1c04ac72da27060.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e7e6e3_ef8d_4863_a024_61596fa46d51.slice/crio-08ea41ff5e49273b9e1172295e2d60cb625da8effe1941616cbdf94c3cffc4ae\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e924ad6_b71b_400f_af06_406e9e2341e4.slice/crio-conmon-dbde9939c2b7c2c5f8b1a1d7b364a0ded140ebe391ea29a5be3ed919873994cd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e77349a_c322_4ed1_ba26_8d952f277ba2.slice/crio-68444d649b15354e7a4db1446cf27e0aeecb3d6ebf05c580f1c04ac72da27060.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4c17192_39ae_4520_aed0_f6325410b6b6.slice/crio-conmon-2644ab20c671b994a5162da9472b9b4f0b23642d7e291858db5645285635078c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e77349a_c322_4ed1_ba26_8d952f277ba2.slice/crio-conmon-15b4aa9bba267916367e901ef1b5ea45c574c8435626778143f28416523c9cc1.scope\": RecentStats: unable to find data in memory cache]" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.484694 4788 generic.go:334] "Generic (PLEG): container finished" podID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerID="68444d649b15354e7a4db1446cf27e0aeecb3d6ebf05c580f1c04ac72da27060" exitCode=0 Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.484911 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e77349a-c322-4ed1-ba26-8d952f277ba2","Type":"ContainerDied","Data":"68444d649b15354e7a4db1446cf27e0aeecb3d6ebf05c580f1c04ac72da27060"} Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.488092 4788 generic.go:334] "Generic (PLEG): container finished" podID="b4c17192-39ae-4520-aed0-f6325410b6b6" containerID="5286348298ae9e5284ff477c31c3d706f166c1dcd16b4e678a3f30e44509763c" exitCode=0 Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.488140 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b98d796-z866s" event={"ID":"b4c17192-39ae-4520-aed0-f6325410b6b6","Type":"ContainerDied","Data":"5286348298ae9e5284ff477c31c3d706f166c1dcd16b4e678a3f30e44509763c"} Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.810590 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.947151 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-scripts\") pod \"9e924ad6-b71b-400f-af06-406e9e2341e4\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.947508 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e924ad6-b71b-400f-af06-406e9e2341e4-etc-machine-id\") pod \"9e924ad6-b71b-400f-af06-406e9e2341e4\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.947542 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc59r\" (UniqueName: \"kubernetes.io/projected/9e924ad6-b71b-400f-af06-406e9e2341e4-kube-api-access-rc59r\") pod \"9e924ad6-b71b-400f-af06-406e9e2341e4\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.947606 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data\") pod \"9e924ad6-b71b-400f-af06-406e9e2341e4\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.947658 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e924ad6-b71b-400f-af06-406e9e2341e4-logs\") pod \"9e924ad6-b71b-400f-af06-406e9e2341e4\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.947735 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data-custom\") pod \"9e924ad6-b71b-400f-af06-406e9e2341e4\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.947836 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-combined-ca-bundle\") pod \"9e924ad6-b71b-400f-af06-406e9e2341e4\" (UID: \"9e924ad6-b71b-400f-af06-406e9e2341e4\") " Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.948982 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e924ad6-b71b-400f-af06-406e9e2341e4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9e924ad6-b71b-400f-af06-406e9e2341e4" (UID: "9e924ad6-b71b-400f-af06-406e9e2341e4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.949554 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e924ad6-b71b-400f-af06-406e9e2341e4-logs" (OuterVolumeSpecName: "logs") pod "9e924ad6-b71b-400f-af06-406e9e2341e4" (UID: "9e924ad6-b71b-400f-af06-406e9e2341e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.955796 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.957023 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-scripts" (OuterVolumeSpecName: "scripts") pod "9e924ad6-b71b-400f-af06-406e9e2341e4" (UID: "9e924ad6-b71b-400f-af06-406e9e2341e4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.959366 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9e924ad6-b71b-400f-af06-406e9e2341e4" (UID: "9e924ad6-b71b-400f-af06-406e9e2341e4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.959789 4788 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.959809 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.959818 4788 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9e924ad6-b71b-400f-af06-406e9e2341e4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.959827 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e924ad6-b71b-400f-af06-406e9e2341e4-logs\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.960194 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c6b98d796-z866s" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.968646 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e924ad6-b71b-400f-af06-406e9e2341e4-kube-api-access-rc59r" (OuterVolumeSpecName: "kube-api-access-rc59r") pod "9e924ad6-b71b-400f-af06-406e9e2341e4" (UID: "9e924ad6-b71b-400f-af06-406e9e2341e4"). InnerVolumeSpecName "kube-api-access-rc59r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:18 crc kubenswrapper[4788]: I0219 09:03:18.992180 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e924ad6-b71b-400f-af06-406e9e2341e4" (UID: "9e924ad6-b71b-400f-af06-406e9e2341e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.022394 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data" (OuterVolumeSpecName: "config-data") pod "9e924ad6-b71b-400f-af06-406e9e2341e4" (UID: "9e924ad6-b71b-400f-af06-406e9e2341e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061108 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grbjv\" (UniqueName: \"kubernetes.io/projected/b4c17192-39ae-4520-aed0-f6325410b6b6-kube-api-access-grbjv\") pod \"b4c17192-39ae-4520-aed0-f6325410b6b6\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061222 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-config-data\") pod \"5e77349a-c322-4ed1-ba26-8d952f277ba2\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061271 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg5jf\" (UniqueName: \"kubernetes.io/projected/5e77349a-c322-4ed1-ba26-8d952f277ba2-kube-api-access-vg5jf\") pod \"5e77349a-c322-4ed1-ba26-8d952f277ba2\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061304 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-combined-ca-bundle\") pod \"b4c17192-39ae-4520-aed0-f6325410b6b6\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061333 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-config\") pod \"b4c17192-39ae-4520-aed0-f6325410b6b6\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061401 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-httpd-config\") pod \"b4c17192-39ae-4520-aed0-f6325410b6b6\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061461 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-combined-ca-bundle\") pod \"5e77349a-c322-4ed1-ba26-8d952f277ba2\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061533 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-run-httpd\") pod \"5e77349a-c322-4ed1-ba26-8d952f277ba2\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061612 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-ovndb-tls-certs\") pod \"b4c17192-39ae-4520-aed0-f6325410b6b6\" (UID: \"b4c17192-39ae-4520-aed0-f6325410b6b6\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061639 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-log-httpd\") pod \"5e77349a-c322-4ed1-ba26-8d952f277ba2\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061667 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-scripts\") pod \"5e77349a-c322-4ed1-ba26-8d952f277ba2\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.061718 4788 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-sg-core-conf-yaml\") pod \"5e77349a-c322-4ed1-ba26-8d952f277ba2\" (UID: \"5e77349a-c322-4ed1-ba26-8d952f277ba2\") " Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.062126 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc59r\" (UniqueName: \"kubernetes.io/projected/9e924ad6-b71b-400f-af06-406e9e2341e4-kube-api-access-rc59r\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.062148 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.062160 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e924ad6-b71b-400f-af06-406e9e2341e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.066388 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5e77349a-c322-4ed1-ba26-8d952f277ba2" (UID: "5e77349a-c322-4ed1-ba26-8d952f277ba2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.070090 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c17192-39ae-4520-aed0-f6325410b6b6-kube-api-access-grbjv" (OuterVolumeSpecName: "kube-api-access-grbjv") pod "b4c17192-39ae-4520-aed0-f6325410b6b6" (UID: "b4c17192-39ae-4520-aed0-f6325410b6b6"). InnerVolumeSpecName "kube-api-access-grbjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.070536 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5e77349a-c322-4ed1-ba26-8d952f277ba2" (UID: "5e77349a-c322-4ed1-ba26-8d952f277ba2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.074088 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-scripts" (OuterVolumeSpecName: "scripts") pod "5e77349a-c322-4ed1-ba26-8d952f277ba2" (UID: "5e77349a-c322-4ed1-ba26-8d952f277ba2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.075425 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e77349a-c322-4ed1-ba26-8d952f277ba2-kube-api-access-vg5jf" (OuterVolumeSpecName: "kube-api-access-vg5jf") pod "5e77349a-c322-4ed1-ba26-8d952f277ba2" (UID: "5e77349a-c322-4ed1-ba26-8d952f277ba2"). InnerVolumeSpecName "kube-api-access-vg5jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.081016 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b4c17192-39ae-4520-aed0-f6325410b6b6" (UID: "b4c17192-39ae-4520-aed0-f6325410b6b6"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.163302 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grbjv\" (UniqueName: \"kubernetes.io/projected/b4c17192-39ae-4520-aed0-f6325410b6b6-kube-api-access-grbjv\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.163546 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg5jf\" (UniqueName: \"kubernetes.io/projected/5e77349a-c322-4ed1-ba26-8d952f277ba2-kube-api-access-vg5jf\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.163752 4788 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.163814 4788 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.163867 4788 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5e77349a-c322-4ed1-ba26-8d952f277ba2-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.163919 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.173275 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5e77349a-c322-4ed1-ba26-8d952f277ba2" (UID: "5e77349a-c322-4ed1-ba26-8d952f277ba2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.232124 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e77349a-c322-4ed1-ba26-8d952f277ba2" (UID: "5e77349a-c322-4ed1-ba26-8d952f277ba2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.251379 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4c17192-39ae-4520-aed0-f6325410b6b6" (UID: "b4c17192-39ae-4520-aed0-f6325410b6b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.267195 4788 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.267232 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.267284 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.281650 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-config" (OuterVolumeSpecName: "config") pod "b4c17192-39ae-4520-aed0-f6325410b6b6" (UID: "b4c17192-39ae-4520-aed0-f6325410b6b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.296374 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b4c17192-39ae-4520-aed0-f6325410b6b6" (UID: "b4c17192-39ae-4520-aed0-f6325410b6b6"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.347387 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-config-data" (OuterVolumeSpecName: "config-data") pod "5e77349a-c322-4ed1-ba26-8d952f277ba2" (UID: "5e77349a-c322-4ed1-ba26-8d952f277ba2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.372977 4788 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.373020 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e77349a-c322-4ed1-ba26-8d952f277ba2-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.373033 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b4c17192-39ae-4520-aed0-f6325410b6b6-config\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.499419 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.499657 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5e77349a-c322-4ed1-ba26-8d952f277ba2","Type":"ContainerDied","Data":"450c715c81711600c15286192e3d13329d9828d9ee8bc132208de3721be34c4c"}
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.499904 4788 scope.go:117] "RemoveContainer" containerID="15b4aa9bba267916367e901ef1b5ea45c574c8435626778143f28416523c9cc1"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.502529 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9e924ad6-b71b-400f-af06-406e9e2341e4","Type":"ContainerDied","Data":"2faf4cc09189aa3acf1b150c761a85ad8c7cbb7a0686350394512ae843a5935f"}
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.502661 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.505897 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c6b98d796-z866s" event={"ID":"b4c17192-39ae-4520-aed0-f6325410b6b6","Type":"ContainerDied","Data":"60ae340d34eeab7c9131ac2a059c463b12af83909f747d143ee86e7f0f5d835d"}
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.505972 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c6b98d796-z866s"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.515068 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"dd204842-7289-418d-a4d1-e0d079e368b3","Type":"ContainerStarted","Data":"de3ea2fbed8c0f5973a3b26b8f7a4693d6ee598686ff52c365da3a25c091034b"}
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.517233 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58576d599f-zjnz8" event={"ID":"6afa4604-515a-4774-9d5f-e641cb256988","Type":"ContainerStarted","Data":"361253e89de83cacf88131d9b5e8255b0b65d486d8c236e893d3d4c4a551f0c7"}
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.517271 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58576d599f-zjnz8" event={"ID":"6afa4604-515a-4774-9d5f-e641cb256988","Type":"ContainerStarted","Data":"55e00e2c314b1062c09324308873d0131446d04fa2d2e731a271ae07e0d06090"}
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.517697 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-58576d599f-zjnz8"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.517723 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-58576d599f-zjnz8"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.526778 4788 scope.go:117] "RemoveContainer" containerID="38d69d6e4ae8079f6c7fb2b2018bd736e0b05f2d95d8e80427ad9d01640c00c3"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.548592 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.7963888159999999 podStartE2EDuration="12.548570105s" podCreationTimestamp="2026-02-19 09:03:07 +0000 UTC" firstStartedPulling="2026-02-19 09:03:07.930880591 +0000 UTC m=+1089.918892063" lastFinishedPulling="2026-02-19 09:03:18.68306188 +0000 UTC m=+1100.671073352" observedRunningTime="2026-02-19 09:03:19.530840319 +0000 UTC m=+1101.518851811" watchObservedRunningTime="2026-02-19 09:03:19.548570105 +0000 UTC m=+1101.536581577"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.571521 4788 scope.go:117] "RemoveContainer" containerID="68444d649b15354e7a4db1446cf27e0aeecb3d6ebf05c580f1c04ac72da27060"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.594868 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.603428 4788 scope.go:117] "RemoveContainer" containerID="07b2cdcbf1b7746016e60e8636cf24f0de6f2eef452e01edf5f6672364a87a54"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.609730 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.638509 4788 scope.go:117] "RemoveContainer" containerID="dbde9939c2b7c2c5f8b1a1d7b364a0ded140ebe391ea29a5be3ed919873994cd"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.656792 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.664965 4788 scope.go:117] "RemoveContainer" containerID="093381a8bb52834eac100e41a38499bde99aaad2811427c5fe6c9df7ab179ef4"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.665873 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.677332 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 19 09:03:19 crc kubenswrapper[4788]: E0219 09:03:19.677785 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="ceilometer-central-agent"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.677805 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="ceilometer-central-agent"
Feb 19 09:03:19 crc kubenswrapper[4788]: E0219 09:03:19.677829 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c17192-39ae-4520-aed0-f6325410b6b6" containerName="neutron-api"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.677837 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c17192-39ae-4520-aed0-f6325410b6b6" containerName="neutron-api"
Feb 19 09:03:19 crc kubenswrapper[4788]: E0219 09:03:19.677849 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="sg-core"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.677856 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="sg-core"
Feb 19 09:03:19 crc kubenswrapper[4788]: E0219 09:03:19.677863 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e924ad6-b71b-400f-af06-406e9e2341e4" containerName="cinder-api-log"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.677873 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e924ad6-b71b-400f-af06-406e9e2341e4" containerName="cinder-api-log"
Feb 19 09:03:19 crc kubenswrapper[4788]: E0219 09:03:19.677883 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="ceilometer-notification-agent"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.677890 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="ceilometer-notification-agent"
Feb 19 09:03:19 crc kubenswrapper[4788]: E0219 09:03:19.677903 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="proxy-httpd"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.677910 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="proxy-httpd"
Feb 19 09:03:19 crc kubenswrapper[4788]: E0219 09:03:19.677937 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c17192-39ae-4520-aed0-f6325410b6b6" containerName="neutron-httpd"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.677944 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c17192-39ae-4520-aed0-f6325410b6b6" containerName="neutron-httpd"
Feb 19 09:03:19 crc kubenswrapper[4788]: E0219 09:03:19.677968 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e924ad6-b71b-400f-af06-406e9e2341e4" containerName="cinder-api"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.677976 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e924ad6-b71b-400f-af06-406e9e2341e4" containerName="cinder-api"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.678175 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e924ad6-b71b-400f-af06-406e9e2341e4" containerName="cinder-api"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.678194 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c17192-39ae-4520-aed0-f6325410b6b6" containerName="neutron-httpd"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.678205 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e924ad6-b71b-400f-af06-406e9e2341e4" containerName="cinder-api-log"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.678216 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="ceilometer-central-agent"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.678231 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c17192-39ae-4520-aed0-f6325410b6b6" containerName="neutron-api"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.678260 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="proxy-httpd"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.678271 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="sg-core"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.678295 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" containerName="ceilometer-notification-agent"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.679293 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.682414 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.682827 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.683234 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.686472 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.690768 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.692905 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.694939 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.699659 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.703604 4788 scope.go:117] "RemoveContainer" containerID="2644ab20c671b994a5162da9472b9b4f0b23642d7e291858db5645285635078c"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.706221 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-58576d599f-zjnz8" podStartSLOduration=8.70620362 podStartE2EDuration="8.70620362s" podCreationTimestamp="2026-02-19 09:03:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:19.596717161 +0000 UTC m=+1101.584728643" watchObservedRunningTime="2026-02-19 09:03:19.70620362 +0000 UTC m=+1101.694215092"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.718391 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.729813 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c6b98d796-z866s"]
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.730902 4788 scope.go:117] "RemoveContainer" containerID="5286348298ae9e5284ff477c31c3d706f166c1dcd16b4e678a3f30e44509763c"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.738006 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6c6b98d796-z866s"]
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.781508 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4897501-a017-443d-ac1c-08a9e23629b5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.781567 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-scripts\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.781610 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkrl8\" (UniqueName: \"kubernetes.io/projected/e4897501-a017-443d-ac1c-08a9e23629b5-kube-api-access-vkrl8\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.781632 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.781657 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.781745 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.781872 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4897501-a017-443d-ac1c-08a9e23629b5-logs\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.781939 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-config-data\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.782124 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884225 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4897501-a017-443d-ac1c-08a9e23629b5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884359 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-scripts\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884385 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-scripts\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884402 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4897501-a017-443d-ac1c-08a9e23629b5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884429 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkrl8\" (UniqueName: \"kubernetes.io/projected/e4897501-a017-443d-ac1c-08a9e23629b5-kube-api-access-vkrl8\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884672 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-log-httpd\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884695 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884735 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884845 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884871 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-config-data\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884919 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.884960 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4897501-a017-443d-ac1c-08a9e23629b5-logs\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.885000 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-config-data\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.885046 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf9t9\" (UniqueName: \"kubernetes.io/projected/ef723535-d46b-46be-a561-cedf85829157-kube-api-access-zf9t9\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.885128 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-run-httpd\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.885236 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.885322 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.886355 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4897501-a017-443d-ac1c-08a9e23629b5-logs\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.888741 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.889763 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.890899 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-scripts\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.890991 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.891091 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.891536 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4897501-a017-443d-ac1c-08a9e23629b5-config-data\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.901682 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkrl8\" (UniqueName: \"kubernetes.io/projected/e4897501-a017-443d-ac1c-08a9e23629b5-kube-api-access-vkrl8\") pod \"cinder-api-0\" (UID: \"e4897501-a017-443d-ac1c-08a9e23629b5\") " pod="openstack/cinder-api-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.986978 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf9t9\" (UniqueName: \"kubernetes.io/projected/ef723535-d46b-46be-a561-cedf85829157-kube-api-access-zf9t9\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.987239 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-run-httpd\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.987448 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.987550 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-scripts\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.987628 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-log-httpd\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.987736 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.987801 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-config-data\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.989370 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-log-httpd\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.987770 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-run-httpd\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.991350 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-scripts\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.991522 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-config-data\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.992797 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:19 crc kubenswrapper[4788]: I0219 09:03:19.992915 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:20 crc kubenswrapper[4788]: I0219 09:03:20.004002 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf9t9\" (UniqueName: \"kubernetes.io/projected/ef723535-d46b-46be-a561-cedf85829157-kube-api-access-zf9t9\") pod \"ceilometer-0\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " pod="openstack/ceilometer-0"
Feb 19 09:03:20 crc kubenswrapper[4788]: I0219 09:03:20.017414 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 19 09:03:20 crc kubenswrapper[4788]: I0219 09:03:20.021685 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 09:03:20 crc kubenswrapper[4788]: I0219 09:03:20.463979 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 19 09:03:20 crc kubenswrapper[4788]: I0219 09:03:20.544522 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4897501-a017-443d-ac1c-08a9e23629b5","Type":"ContainerStarted","Data":"1071cee3db7d560fedf38e9785b4ef1c6041b63835fd9a03e00797c3adc3be9a"}
Feb 19 09:03:20 crc kubenswrapper[4788]: I0219 09:03:20.557682 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:03:20 crc kubenswrapper[4788]: I0219 09:03:20.726836 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e77349a-c322-4ed1-ba26-8d952f277ba2" path="/var/lib/kubelet/pods/5e77349a-c322-4ed1-ba26-8d952f277ba2/volumes"
Feb 19 09:03:20 crc kubenswrapper[4788]: I0219 09:03:20.727705 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e924ad6-b71b-400f-af06-406e9e2341e4" path="/var/lib/kubelet/pods/9e924ad6-b71b-400f-af06-406e9e2341e4/volumes"
Feb 19 09:03:20 crc kubenswrapper[4788]: I0219 09:03:20.728547 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4c17192-39ae-4520-aed0-f6325410b6b6" path="/var/lib/kubelet/pods/b4c17192-39ae-4520-aed0-f6325410b6b6/volumes"
Feb 19 09:03:21 crc kubenswrapper[4788]: I0219 09:03:21.174546 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:03:21 crc kubenswrapper[4788]: I0219 09:03:21.587963 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4897501-a017-443d-ac1c-08a9e23629b5","Type":"ContainerStarted","Data":"0c8776e6855b4cc22e0f33b39773e7e559c064014933b0f2192e74984ff727e6"}
Feb 19 09:03:21 crc kubenswrapper[4788]: I0219 09:03:21.590890 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef723535-d46b-46be-a561-cedf85829157","Type":"ContainerStarted","Data":"51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4"}
Feb 19 09:03:21 crc kubenswrapper[4788]: I0219 09:03:21.590920 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef723535-d46b-46be-a561-cedf85829157","Type":"ContainerStarted","Data":"5dbfdc8e3bfc2aa620154322ebcedef771d1d62269bb7d96c1a50563f2b08382"}
Feb 19 09:03:21 crc kubenswrapper[4788]: I0219 09:03:21.699644 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 19 09:03:21 crc kubenswrapper[4788]: I0219 09:03:21.700280 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="13e91365-d18b-4977-9292-91b3f98f8469" containerName="glance-httpd" containerID="cri-o://06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37" gracePeriod=30
Feb 19 09:03:21 crc kubenswrapper[4788]: I0219 09:03:21.699918 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="13e91365-d18b-4977-9292-91b3f98f8469" containerName="glance-log" containerID="cri-o://44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5" gracePeriod=30
Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.139613 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.140121 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.481428 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-cshbz"] Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.483929 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cshbz" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.517308 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cshbz"] Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.584104 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d128-account-create-update-8ptfd"] Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.585833 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d128-account-create-update-8ptfd" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.589378 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.605712 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qvgbw"] Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.606834 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qvgbw" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.616917 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4897501-a017-443d-ac1c-08a9e23629b5","Type":"ContainerStarted","Data":"2e139977d45905e5d0df9eb3cd67ee1bbd76ae37fbc7c57f89e56ad59c00bab8"} Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.617049 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.629680 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d128-account-create-update-8ptfd"] Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.629746 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef723535-d46b-46be-a561-cedf85829157","Type":"ContainerStarted","Data":"31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a"} Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.648454 4788 generic.go:334] "Generic (PLEG): container finished" podID="13e91365-d18b-4977-9292-91b3f98f8469" containerID="44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5" exitCode=143 Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.648678 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13e91365-d18b-4977-9292-91b3f98f8469","Type":"ContainerDied","Data":"44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5"} Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.650303 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15bcd5c9-1256-4778-b245-3f19bc742903-operator-scripts\") pod \"nova-cell0-db-create-qvgbw\" (UID: \"15bcd5c9-1256-4778-b245-3f19bc742903\") " pod="openstack/nova-cell0-db-create-qvgbw" Feb 19 09:03:22 
crc kubenswrapper[4788]: I0219 09:03:22.650429 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbc3972e-f29c-430c-9da0-29f51a8e6a47-operator-scripts\") pod \"nova-api-d128-account-create-update-8ptfd\" (UID: \"bbc3972e-f29c-430c-9da0-29f51a8e6a47\") " pod="openstack/nova-api-d128-account-create-update-8ptfd" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.650540 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a0bc66-750b-4618-bd07-033c189eafcf-operator-scripts\") pod \"nova-api-db-create-cshbz\" (UID: \"43a0bc66-750b-4618-bd07-033c189eafcf\") " pod="openstack/nova-api-db-create-cshbz" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.650627 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vphnm\" (UniqueName: \"kubernetes.io/projected/bbc3972e-f29c-430c-9da0-29f51a8e6a47-kube-api-access-vphnm\") pod \"nova-api-d128-account-create-update-8ptfd\" (UID: \"bbc3972e-f29c-430c-9da0-29f51a8e6a47\") " pod="openstack/nova-api-d128-account-create-update-8ptfd" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.650715 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx5nx\" (UniqueName: \"kubernetes.io/projected/43a0bc66-750b-4618-bd07-033c189eafcf-kube-api-access-gx5nx\") pod \"nova-api-db-create-cshbz\" (UID: \"43a0bc66-750b-4618-bd07-033c189eafcf\") " pod="openstack/nova-api-db-create-cshbz" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.650833 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwtl6\" (UniqueName: \"kubernetes.io/projected/15bcd5c9-1256-4778-b245-3f19bc742903-kube-api-access-zwtl6\") pod 
\"nova-cell0-db-create-qvgbw\" (UID: \"15bcd5c9-1256-4778-b245-3f19bc742903\") " pod="openstack/nova-cell0-db-create-qvgbw" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.650308 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qvgbw"] Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.676157 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.676134487 podStartE2EDuration="3.676134487s" podCreationTimestamp="2026-02-19 09:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:22.648307946 +0000 UTC m=+1104.636319428" watchObservedRunningTime="2026-02-19 09:03:22.676134487 +0000 UTC m=+1104.664145959" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.753216 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15bcd5c9-1256-4778-b245-3f19bc742903-operator-scripts\") pod \"nova-cell0-db-create-qvgbw\" (UID: \"15bcd5c9-1256-4778-b245-3f19bc742903\") " pod="openstack/nova-cell0-db-create-qvgbw" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.753560 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbc3972e-f29c-430c-9da0-29f51a8e6a47-operator-scripts\") pod \"nova-api-d128-account-create-update-8ptfd\" (UID: \"bbc3972e-f29c-430c-9da0-29f51a8e6a47\") " pod="openstack/nova-api-d128-account-create-update-8ptfd" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.753649 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a0bc66-750b-4618-bd07-033c189eafcf-operator-scripts\") pod \"nova-api-db-create-cshbz\" (UID: \"43a0bc66-750b-4618-bd07-033c189eafcf\") " 
pod="openstack/nova-api-db-create-cshbz" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.753748 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vphnm\" (UniqueName: \"kubernetes.io/projected/bbc3972e-f29c-430c-9da0-29f51a8e6a47-kube-api-access-vphnm\") pod \"nova-api-d128-account-create-update-8ptfd\" (UID: \"bbc3972e-f29c-430c-9da0-29f51a8e6a47\") " pod="openstack/nova-api-d128-account-create-update-8ptfd" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.753832 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx5nx\" (UniqueName: \"kubernetes.io/projected/43a0bc66-750b-4618-bd07-033c189eafcf-kube-api-access-gx5nx\") pod \"nova-api-db-create-cshbz\" (UID: \"43a0bc66-750b-4618-bd07-033c189eafcf\") " pod="openstack/nova-api-db-create-cshbz" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.753936 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwtl6\" (UniqueName: \"kubernetes.io/projected/15bcd5c9-1256-4778-b245-3f19bc742903-kube-api-access-zwtl6\") pod \"nova-cell0-db-create-qvgbw\" (UID: \"15bcd5c9-1256-4778-b245-3f19bc742903\") " pod="openstack/nova-cell0-db-create-qvgbw" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.754881 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a0bc66-750b-4618-bd07-033c189eafcf-operator-scripts\") pod \"nova-api-db-create-cshbz\" (UID: \"43a0bc66-750b-4618-bd07-033c189eafcf\") " pod="openstack/nova-api-db-create-cshbz" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.756201 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15bcd5c9-1256-4778-b245-3f19bc742903-operator-scripts\") pod \"nova-cell0-db-create-qvgbw\" (UID: \"15bcd5c9-1256-4778-b245-3f19bc742903\") " 
pod="openstack/nova-cell0-db-create-qvgbw" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.756354 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbc3972e-f29c-430c-9da0-29f51a8e6a47-operator-scripts\") pod \"nova-api-d128-account-create-update-8ptfd\" (UID: \"bbc3972e-f29c-430c-9da0-29f51a8e6a47\") " pod="openstack/nova-api-d128-account-create-update-8ptfd" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.789914 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vphnm\" (UniqueName: \"kubernetes.io/projected/bbc3972e-f29c-430c-9da0-29f51a8e6a47-kube-api-access-vphnm\") pod \"nova-api-d128-account-create-update-8ptfd\" (UID: \"bbc3972e-f29c-430c-9da0-29f51a8e6a47\") " pod="openstack/nova-api-d128-account-create-update-8ptfd" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.790758 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwtl6\" (UniqueName: \"kubernetes.io/projected/15bcd5c9-1256-4778-b245-3f19bc742903-kube-api-access-zwtl6\") pod \"nova-cell0-db-create-qvgbw\" (UID: \"15bcd5c9-1256-4778-b245-3f19bc742903\") " pod="openstack/nova-cell0-db-create-qvgbw" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.809036 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-s7cvf"] Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.810104 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx5nx\" (UniqueName: \"kubernetes.io/projected/43a0bc66-750b-4618-bd07-033c189eafcf-kube-api-access-gx5nx\") pod \"nova-api-db-create-cshbz\" (UID: \"43a0bc66-750b-4618-bd07-033c189eafcf\") " pod="openstack/nova-api-db-create-cshbz" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.810407 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-s7cvf" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.821120 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cshbz" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.838403 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.838796 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="855949a4-e027-44b4-8705-202c74c3ffdb" containerName="glance-log" containerID="cri-o://6946745df3ff4febdf38cfda55b437589b22c84ad790a5b24c86aa802cad8ddf" gracePeriod=30 Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.839376 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="855949a4-e027-44b4-8705-202c74c3ffdb" containerName="glance-httpd" containerID="cri-o://84ffdaa68b4ec177038b46443924be2446a2c21907fb89afa57c772f8ed8be25" gracePeriod=30 Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.887355 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9025-account-create-update-nrk5c"] Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.889778 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9025-account-create-update-nrk5c" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.892313 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.917039 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-s7cvf"] Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.919423 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d128-account-create-update-8ptfd" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.943783 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qvgbw" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.963612 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc9gc\" (UniqueName: \"kubernetes.io/projected/db2289eb-977d-42f1-a70a-772737cc197a-kube-api-access-gc9gc\") pod \"nova-cell1-db-create-s7cvf\" (UID: \"db2289eb-977d-42f1-a70a-772737cc197a\") " pod="openstack/nova-cell1-db-create-s7cvf" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.963693 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db2289eb-977d-42f1-a70a-772737cc197a-operator-scripts\") pod \"nova-cell1-db-create-s7cvf\" (UID: \"db2289eb-977d-42f1-a70a-772737cc197a\") " pod="openstack/nova-cell1-db-create-s7cvf" Feb 19 09:03:22 crc kubenswrapper[4788]: I0219 09:03:22.963832 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9025-account-create-update-nrk5c"] Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.002558 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6d1f-account-create-update-sxrgh"] Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.004076 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.007589 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.013454 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6d1f-account-create-update-sxrgh"] Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.065764 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vwzd\" (UniqueName: \"kubernetes.io/projected/2e0102d8-4abf-499f-bfe8-149ace187639-kube-api-access-8vwzd\") pod \"nova-cell0-9025-account-create-update-nrk5c\" (UID: \"2e0102d8-4abf-499f-bfe8-149ace187639\") " pod="openstack/nova-cell0-9025-account-create-update-nrk5c" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.065868 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc9gc\" (UniqueName: \"kubernetes.io/projected/db2289eb-977d-42f1-a70a-772737cc197a-kube-api-access-gc9gc\") pod \"nova-cell1-db-create-s7cvf\" (UID: \"db2289eb-977d-42f1-a70a-772737cc197a\") " pod="openstack/nova-cell1-db-create-s7cvf" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.065924 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db2289eb-977d-42f1-a70a-772737cc197a-operator-scripts\") pod \"nova-cell1-db-create-s7cvf\" (UID: \"db2289eb-977d-42f1-a70a-772737cc197a\") " pod="openstack/nova-cell1-db-create-s7cvf" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.065980 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0102d8-4abf-499f-bfe8-149ace187639-operator-scripts\") pod 
\"nova-cell0-9025-account-create-update-nrk5c\" (UID: \"2e0102d8-4abf-499f-bfe8-149ace187639\") " pod="openstack/nova-cell0-9025-account-create-update-nrk5c" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.068911 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db2289eb-977d-42f1-a70a-772737cc197a-operator-scripts\") pod \"nova-cell1-db-create-s7cvf\" (UID: \"db2289eb-977d-42f1-a70a-772737cc197a\") " pod="openstack/nova-cell1-db-create-s7cvf" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.102264 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc9gc\" (UniqueName: \"kubernetes.io/projected/db2289eb-977d-42f1-a70a-772737cc197a-kube-api-access-gc9gc\") pod \"nova-cell1-db-create-s7cvf\" (UID: \"db2289eb-977d-42f1-a70a-772737cc197a\") " pod="openstack/nova-cell1-db-create-s7cvf" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.170074 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vwzd\" (UniqueName: \"kubernetes.io/projected/2e0102d8-4abf-499f-bfe8-149ace187639-kube-api-access-8vwzd\") pod \"nova-cell0-9025-account-create-update-nrk5c\" (UID: \"2e0102d8-4abf-499f-bfe8-149ace187639\") " pod="openstack/nova-cell0-9025-account-create-update-nrk5c" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.170674 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0102d8-4abf-499f-bfe8-149ace187639-operator-scripts\") pod \"nova-cell0-9025-account-create-update-nrk5c\" (UID: \"2e0102d8-4abf-499f-bfe8-149ace187639\") " pod="openstack/nova-cell0-9025-account-create-update-nrk5c" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.170710 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/60fe20ca-6454-4f38-90ab-16facbb9fb53-operator-scripts\") pod \"nova-cell1-6d1f-account-create-update-sxrgh\" (UID: \"60fe20ca-6454-4f38-90ab-16facbb9fb53\") " pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.170757 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggbvt\" (UniqueName: \"kubernetes.io/projected/60fe20ca-6454-4f38-90ab-16facbb9fb53-kube-api-access-ggbvt\") pod \"nova-cell1-6d1f-account-create-update-sxrgh\" (UID: \"60fe20ca-6454-4f38-90ab-16facbb9fb53\") " pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.172976 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0102d8-4abf-499f-bfe8-149ace187639-operator-scripts\") pod \"nova-cell0-9025-account-create-update-nrk5c\" (UID: \"2e0102d8-4abf-499f-bfe8-149ace187639\") " pod="openstack/nova-cell0-9025-account-create-update-nrk5c" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.194374 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vwzd\" (UniqueName: \"kubernetes.io/projected/2e0102d8-4abf-499f-bfe8-149ace187639-kube-api-access-8vwzd\") pod \"nova-cell0-9025-account-create-update-nrk5c\" (UID: \"2e0102d8-4abf-499f-bfe8-149ace187639\") " pod="openstack/nova-cell0-9025-account-create-update-nrk5c" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.216814 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s7cvf" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.250122 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9025-account-create-update-nrk5c" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.272606 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fe20ca-6454-4f38-90ab-16facbb9fb53-operator-scripts\") pod \"nova-cell1-6d1f-account-create-update-sxrgh\" (UID: \"60fe20ca-6454-4f38-90ab-16facbb9fb53\") " pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.272696 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggbvt\" (UniqueName: \"kubernetes.io/projected/60fe20ca-6454-4f38-90ab-16facbb9fb53-kube-api-access-ggbvt\") pod \"nova-cell1-6d1f-account-create-update-sxrgh\" (UID: \"60fe20ca-6454-4f38-90ab-16facbb9fb53\") " pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.273762 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fe20ca-6454-4f38-90ab-16facbb9fb53-operator-scripts\") pod \"nova-cell1-6d1f-account-create-update-sxrgh\" (UID: \"60fe20ca-6454-4f38-90ab-16facbb9fb53\") " pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.290651 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggbvt\" (UniqueName: \"kubernetes.io/projected/60fe20ca-6454-4f38-90ab-16facbb9fb53-kube-api-access-ggbvt\") pod \"nova-cell1-6d1f-account-create-update-sxrgh\" (UID: \"60fe20ca-6454-4f38-90ab-16facbb9fb53\") " pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.360695 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh" Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.422558 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cshbz"] Feb 19 09:03:23 crc kubenswrapper[4788]: W0219 09:03:23.434175 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43a0bc66_750b_4618_bd07_033c189eafcf.slice/crio-ec686f5bc5b153ad7559aca2fc578796b517bd89928bc4205205bd98cf87eb54 WatchSource:0}: Error finding container ec686f5bc5b153ad7559aca2fc578796b517bd89928bc4205205bd98cf87eb54: Status 404 returned error can't find the container with id ec686f5bc5b153ad7559aca2fc578796b517bd89928bc4205205bd98cf87eb54 Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.547428 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qvgbw"] Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.563551 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d128-account-create-update-8ptfd"] Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.668275 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d128-account-create-update-8ptfd" event={"ID":"bbc3972e-f29c-430c-9da0-29f51a8e6a47","Type":"ContainerStarted","Data":"5cf6d4a7daf5a844127be979cb99e4ae61990771ca1b589d65223dca2a9445b9"} Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.674814 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cshbz" event={"ID":"43a0bc66-750b-4618-bd07-033c189eafcf","Type":"ContainerStarted","Data":"ec686f5bc5b153ad7559aca2fc578796b517bd89928bc4205205bd98cf87eb54"} Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.679750 4788 generic.go:334] "Generic (PLEG): container finished" podID="855949a4-e027-44b4-8705-202c74c3ffdb" 
containerID="6946745df3ff4febdf38cfda55b437589b22c84ad790a5b24c86aa802cad8ddf" exitCode=143 Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.679814 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"855949a4-e027-44b4-8705-202c74c3ffdb","Type":"ContainerDied","Data":"6946745df3ff4febdf38cfda55b437589b22c84ad790a5b24c86aa802cad8ddf"} Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.685565 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qvgbw" event={"ID":"15bcd5c9-1256-4778-b245-3f19bc742903","Type":"ContainerStarted","Data":"e291c2b34ad07f3c91a3268b8d529d0f034357a8a7f18158e6817cb18cd630fd"} Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.698490 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef723535-d46b-46be-a561-cedf85829157","Type":"ContainerStarted","Data":"1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a"} Feb 19 09:03:23 crc kubenswrapper[4788]: W0219 09:03:23.793292 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e0102d8_4abf_499f_bfe8_149ace187639.slice/crio-a94fc4bf601ea12e76e30436f728555ebed98dc809e97eecc9601d6d751f36d9 WatchSource:0}: Error finding container a94fc4bf601ea12e76e30436f728555ebed98dc809e97eecc9601d6d751f36d9: Status 404 returned error can't find the container with id a94fc4bf601ea12e76e30436f728555ebed98dc809e97eecc9601d6d751f36d9 Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.802914 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9025-account-create-update-nrk5c"] Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.834597 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-s7cvf"] Feb 19 09:03:23 crc kubenswrapper[4788]: I0219 09:03:23.930827 4788 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell1-6d1f-account-create-update-sxrgh"] Feb 19 09:03:23 crc kubenswrapper[4788]: W0219 09:03:23.936396 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60fe20ca_6454_4f38_90ab_16facbb9fb53.slice/crio-ef9a92f07aeb58c44cb4bf65a5486c1a87699d6c6178028ecb668a1c69ad709e WatchSource:0}: Error finding container ef9a92f07aeb58c44cb4bf65a5486c1a87699d6c6178028ecb668a1c69ad709e: Status 404 returned error can't find the container with id ef9a92f07aeb58c44cb4bf65a5486c1a87699d6c6178028ecb668a1c69ad709e Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.728605 4788 generic.go:334] "Generic (PLEG): container finished" podID="db2289eb-977d-42f1-a70a-772737cc197a" containerID="9aafee9d4dccba11500c2598bea95a81d3ffd40a43dd81ab4d4a24604be220f5" exitCode=0 Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.734851 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s7cvf" event={"ID":"db2289eb-977d-42f1-a70a-772737cc197a","Type":"ContainerDied","Data":"9aafee9d4dccba11500c2598bea95a81d3ffd40a43dd81ab4d4a24604be220f5"} Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.734911 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s7cvf" event={"ID":"db2289eb-977d-42f1-a70a-772737cc197a","Type":"ContainerStarted","Data":"fe17720a957772033812087317d41d11249db3d725b5d188925fd05758c36efb"} Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.735180 4788 generic.go:334] "Generic (PLEG): container finished" podID="15bcd5c9-1256-4778-b245-3f19bc742903" containerID="5277c64d75c0cf2ad2f41d6b4441d76bf14236763e566f1da28ab2099fc982cb" exitCode=0 Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.735289 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qvgbw" 
event={"ID":"15bcd5c9-1256-4778-b245-3f19bc742903","Type":"ContainerDied","Data":"5277c64d75c0cf2ad2f41d6b4441d76bf14236763e566f1da28ab2099fc982cb"} Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.748001 4788 generic.go:334] "Generic (PLEG): container finished" podID="2e0102d8-4abf-499f-bfe8-149ace187639" containerID="645545c3d44a325aabfedef594b111d4f4f199768422929f115ef56de5621ed0" exitCode=0 Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.748072 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9025-account-create-update-nrk5c" event={"ID":"2e0102d8-4abf-499f-bfe8-149ace187639","Type":"ContainerDied","Data":"645545c3d44a325aabfedef594b111d4f4f199768422929f115ef56de5621ed0"} Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.748101 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9025-account-create-update-nrk5c" event={"ID":"2e0102d8-4abf-499f-bfe8-149ace187639","Type":"ContainerStarted","Data":"a94fc4bf601ea12e76e30436f728555ebed98dc809e97eecc9601d6d751f36d9"} Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.767892 4788 generic.go:334] "Generic (PLEG): container finished" podID="60fe20ca-6454-4f38-90ab-16facbb9fb53" containerID="5730e69c134dd7e97b3782f5cdf06025a8ece910a225f19f11d56b280425847b" exitCode=0 Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.767980 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh" event={"ID":"60fe20ca-6454-4f38-90ab-16facbb9fb53","Type":"ContainerDied","Data":"5730e69c134dd7e97b3782f5cdf06025a8ece910a225f19f11d56b280425847b"} Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.768014 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh" event={"ID":"60fe20ca-6454-4f38-90ab-16facbb9fb53","Type":"ContainerStarted","Data":"ef9a92f07aeb58c44cb4bf65a5486c1a87699d6c6178028ecb668a1c69ad709e"} Feb 19 09:03:24 crc 
kubenswrapper[4788]: I0219 09:03:24.770875 4788 generic.go:334] "Generic (PLEG): container finished" podID="bbc3972e-f29c-430c-9da0-29f51a8e6a47" containerID="a708a2d05af63e70ad0b55420f6da7bf33320ae6c612e91a990ec7a496d097d5" exitCode=0 Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.770936 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d128-account-create-update-8ptfd" event={"ID":"bbc3972e-f29c-430c-9da0-29f51a8e6a47","Type":"ContainerDied","Data":"a708a2d05af63e70ad0b55420f6da7bf33320ae6c612e91a990ec7a496d097d5"} Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.775340 4788 generic.go:334] "Generic (PLEG): container finished" podID="43a0bc66-750b-4618-bd07-033c189eafcf" containerID="ab0295045488273318b481d0a14297364c23bb2062591a3491f59dc6b17a95e1" exitCode=0 Feb 19 09:03:24 crc kubenswrapper[4788]: I0219 09:03:24.775390 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cshbz" event={"ID":"43a0bc66-750b-4618-bd07-033c189eafcf","Type":"ContainerDied","Data":"ab0295045488273318b481d0a14297364c23bb2062591a3491f59dc6b17a95e1"} Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.506562 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.625287 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-logs\") pod \"13e91365-d18b-4977-9292-91b3f98f8469\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.625371 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-config-data\") pod \"13e91365-d18b-4977-9292-91b3f98f8469\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.625478 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"13e91365-d18b-4977-9292-91b3f98f8469\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.625500 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-scripts\") pod \"13e91365-d18b-4977-9292-91b3f98f8469\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.625540 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-combined-ca-bundle\") pod \"13e91365-d18b-4977-9292-91b3f98f8469\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.625573 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cl2z\" (UniqueName: 
\"kubernetes.io/projected/13e91365-d18b-4977-9292-91b3f98f8469-kube-api-access-2cl2z\") pod \"13e91365-d18b-4977-9292-91b3f98f8469\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.625593 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-public-tls-certs\") pod \"13e91365-d18b-4977-9292-91b3f98f8469\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.625649 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-httpd-run\") pod \"13e91365-d18b-4977-9292-91b3f98f8469\" (UID: \"13e91365-d18b-4977-9292-91b3f98f8469\") " Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.626591 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "13e91365-d18b-4977-9292-91b3f98f8469" (UID: "13e91365-d18b-4977-9292-91b3f98f8469"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.626845 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-logs" (OuterVolumeSpecName: "logs") pod "13e91365-d18b-4977-9292-91b3f98f8469" (UID: "13e91365-d18b-4977-9292-91b3f98f8469"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.644523 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-scripts" (OuterVolumeSpecName: "scripts") pod "13e91365-d18b-4977-9292-91b3f98f8469" (UID: "13e91365-d18b-4977-9292-91b3f98f8469"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.646415 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13e91365-d18b-4977-9292-91b3f98f8469-kube-api-access-2cl2z" (OuterVolumeSpecName: "kube-api-access-2cl2z") pod "13e91365-d18b-4977-9292-91b3f98f8469" (UID: "13e91365-d18b-4977-9292-91b3f98f8469"). InnerVolumeSpecName "kube-api-access-2cl2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.647467 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "13e91365-d18b-4977-9292-91b3f98f8469" (UID: "13e91365-d18b-4977-9292-91b3f98f8469"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.687640 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13e91365-d18b-4977-9292-91b3f98f8469" (UID: "13e91365-d18b-4977-9292-91b3f98f8469"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.695034 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-config-data" (OuterVolumeSpecName: "config-data") pod "13e91365-d18b-4977-9292-91b3f98f8469" (UID: "13e91365-d18b-4977-9292-91b3f98f8469"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.702866 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "13e91365-d18b-4977-9292-91b3f98f8469" (UID: "13e91365-d18b-4977-9292-91b3f98f8469"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.728725 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-logs\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.728770 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.728837 4788 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.728850 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.728890 
4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.728906 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cl2z\" (UniqueName: \"kubernetes.io/projected/13e91365-d18b-4977-9292-91b3f98f8469-kube-api-access-2cl2z\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.728923 4788 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/13e91365-d18b-4977-9292-91b3f98f8469-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.728935 4788 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/13e91365-d18b-4977-9292-91b3f98f8469-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.754442 4788 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.784870 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef723535-d46b-46be-a561-cedf85829157","Type":"ContainerStarted","Data":"0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c"} Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.784999 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="ceilometer-central-agent" containerID="cri-o://51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4" gracePeriod=30 Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.785119 4788 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="proxy-httpd" containerID="cri-o://0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c" gracePeriod=30 Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.785172 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="sg-core" containerID="cri-o://1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a" gracePeriod=30 Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.785209 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="ceilometer-notification-agent" containerID="cri-o://31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a" gracePeriod=30 Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.785017 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.793115 4788 generic.go:334] "Generic (PLEG): container finished" podID="13e91365-d18b-4977-9292-91b3f98f8469" containerID="06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37" exitCode=0 Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.793371 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.796160 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13e91365-d18b-4977-9292-91b3f98f8469","Type":"ContainerDied","Data":"06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37"} Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.796207 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"13e91365-d18b-4977-9292-91b3f98f8469","Type":"ContainerDied","Data":"8b02d85966c2a35277ddfe31c0eca403fd8d92c188bf71748b66a40913185ef5"} Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.796228 4788 scope.go:117] "RemoveContainer" containerID="06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.820669 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.482964529 podStartE2EDuration="6.820645858s" podCreationTimestamp="2026-02-19 09:03:19 +0000 UTC" firstStartedPulling="2026-02-19 09:03:20.582131794 +0000 UTC m=+1102.570143266" lastFinishedPulling="2026-02-19 09:03:24.919813123 +0000 UTC m=+1106.907824595" observedRunningTime="2026-02-19 09:03:25.806737823 +0000 UTC m=+1107.794749305" watchObservedRunningTime="2026-02-19 09:03:25.820645858 +0000 UTC m=+1107.808657340" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.830825 4788 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.873841 4788 scope.go:117] "RemoveContainer" containerID="44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.883562 4788 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.892752 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.916019 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:03:25 crc kubenswrapper[4788]: E0219 09:03:25.916518 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13e91365-d18b-4977-9292-91b3f98f8469" containerName="glance-httpd" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.916538 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e91365-d18b-4977-9292-91b3f98f8469" containerName="glance-httpd" Feb 19 09:03:25 crc kubenswrapper[4788]: E0219 09:03:25.916612 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13e91365-d18b-4977-9292-91b3f98f8469" containerName="glance-log" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.916624 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e91365-d18b-4977-9292-91b3f98f8469" containerName="glance-log" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.916834 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="13e91365-d18b-4977-9292-91b3f98f8469" containerName="glance-log" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.916855 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="13e91365-d18b-4977-9292-91b3f98f8469" containerName="glance-httpd" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.918014 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.930002 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.930352 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.953348 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.989474 4788 scope.go:117] "RemoveContainer" containerID="06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37" Feb 19 09:03:25 crc kubenswrapper[4788]: E0219 09:03:25.990173 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37\": container with ID starting with 06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37 not found: ID does not exist" containerID="06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.990207 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37"} err="failed to get container status \"06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37\": rpc error: code = NotFound desc = could not find container \"06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37\": container with ID starting with 06ebcb4739c04bf3f053a2f6d7152ed27ebff9a8ccc95971ed630fc43ebc6a37 not found: ID does not exist" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.990229 4788 scope.go:117] "RemoveContainer" 
containerID="44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5" Feb 19 09:03:25 crc kubenswrapper[4788]: E0219 09:03:25.995197 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5\": container with ID starting with 44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5 not found: ID does not exist" containerID="44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5" Feb 19 09:03:25 crc kubenswrapper[4788]: I0219 09:03:25.995450 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5"} err="failed to get container status \"44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5\": rpc error: code = NotFound desc = could not find container \"44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5\": container with ID starting with 44ee04241dfd3f510172084ed9b7be0c67585dbb2842add98bf942a351a83cb5 not found: ID does not exist" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.034423 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-config-data\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.034479 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/850e55a0-6179-423a-8698-ae1f87b8c049-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 
09:03:26.034543 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-scripts\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.034571 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/850e55a0-6179-423a-8698-ae1f87b8c049-logs\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.034633 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.034671 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdnpz\" (UniqueName: \"kubernetes.io/projected/850e55a0-6179-423a-8698-ae1f87b8c049-kube-api-access-qdnpz\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.034774 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 
09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.034817 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.136686 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-scripts\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.137048 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/850e55a0-6179-423a-8698-ae1f87b8c049-logs\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.137090 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.137117 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdnpz\" (UniqueName: \"kubernetes.io/projected/850e55a0-6179-423a-8698-ae1f87b8c049-kube-api-access-qdnpz\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.137233 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.137364 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.137457 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-config-data\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.137484 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/850e55a0-6179-423a-8698-ae1f87b8c049-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.137601 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/850e55a0-6179-423a-8698-ae1f87b8c049-logs\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.137962 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/850e55a0-6179-423a-8698-ae1f87b8c049-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.138076 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.142676 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.143532 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.149951 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-scripts\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0" Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.157516 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/850e55a0-6179-423a-8698-ae1f87b8c049-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.173255 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdnpz\" (UniqueName: \"kubernetes.io/projected/850e55a0-6179-423a-8698-ae1f87b8c049-kube-api-access-qdnpz\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.182122 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"850e55a0-6179-423a-8698-ae1f87b8c049\") " pod="openstack/glance-default-external-api-0"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.209779 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cshbz"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.273356 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.341063 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx5nx\" (UniqueName: \"kubernetes.io/projected/43a0bc66-750b-4618-bd07-033c189eafcf-kube-api-access-gx5nx\") pod \"43a0bc66-750b-4618-bd07-033c189eafcf\" (UID: \"43a0bc66-750b-4618-bd07-033c189eafcf\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.341294 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a0bc66-750b-4618-bd07-033c189eafcf-operator-scripts\") pod \"43a0bc66-750b-4618-bd07-033c189eafcf\" (UID: \"43a0bc66-750b-4618-bd07-033c189eafcf\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.342629 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43a0bc66-750b-4618-bd07-033c189eafcf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43a0bc66-750b-4618-bd07-033c189eafcf" (UID: "43a0bc66-750b-4618-bd07-033c189eafcf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.346295 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a0bc66-750b-4618-bd07-033c189eafcf-kube-api-access-gx5nx" (OuterVolumeSpecName: "kube-api-access-gx5nx") pod "43a0bc66-750b-4618-bd07-033c189eafcf" (UID: "43a0bc66-750b-4618-bd07-033c189eafcf"). InnerVolumeSpecName "kube-api-access-gx5nx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.385174 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d128-account-create-update-8ptfd"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.408829 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9025-account-create-update-nrk5c"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.443895 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vphnm\" (UniqueName: \"kubernetes.io/projected/bbc3972e-f29c-430c-9da0-29f51a8e6a47-kube-api-access-vphnm\") pod \"bbc3972e-f29c-430c-9da0-29f51a8e6a47\" (UID: \"bbc3972e-f29c-430c-9da0-29f51a8e6a47\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.443969 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbc3972e-f29c-430c-9da0-29f51a8e6a47-operator-scripts\") pod \"bbc3972e-f29c-430c-9da0-29f51a8e6a47\" (UID: \"bbc3972e-f29c-430c-9da0-29f51a8e6a47\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.444078 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vwzd\" (UniqueName: \"kubernetes.io/projected/2e0102d8-4abf-499f-bfe8-149ace187639-kube-api-access-8vwzd\") pod \"2e0102d8-4abf-499f-bfe8-149ace187639\" (UID: \"2e0102d8-4abf-499f-bfe8-149ace187639\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.444601 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx5nx\" (UniqueName: \"kubernetes.io/projected/43a0bc66-750b-4618-bd07-033c189eafcf-kube-api-access-gx5nx\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.444624 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43a0bc66-750b-4618-bd07-033c189eafcf-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.444770 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbc3972e-f29c-430c-9da0-29f51a8e6a47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bbc3972e-f29c-430c-9da0-29f51a8e6a47" (UID: "bbc3972e-f29c-430c-9da0-29f51a8e6a47"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.458296 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbc3972e-f29c-430c-9da0-29f51a8e6a47-kube-api-access-vphnm" (OuterVolumeSpecName: "kube-api-access-vphnm") pod "bbc3972e-f29c-430c-9da0-29f51a8e6a47" (UID: "bbc3972e-f29c-430c-9da0-29f51a8e6a47"). InnerVolumeSpecName "kube-api-access-vphnm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.459551 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0102d8-4abf-499f-bfe8-149ace187639-kube-api-access-8vwzd" (OuterVolumeSpecName: "kube-api-access-8vwzd") pod "2e0102d8-4abf-499f-bfe8-149ace187639" (UID: "2e0102d8-4abf-499f-bfe8-149ace187639"). InnerVolumeSpecName "kube-api-access-8vwzd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.546668 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0102d8-4abf-499f-bfe8-149ace187639-operator-scripts\") pod \"2e0102d8-4abf-499f-bfe8-149ace187639\" (UID: \"2e0102d8-4abf-499f-bfe8-149ace187639\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.547546 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbc3972e-f29c-430c-9da0-29f51a8e6a47-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.547564 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vwzd\" (UniqueName: \"kubernetes.io/projected/2e0102d8-4abf-499f-bfe8-149ace187639-kube-api-access-8vwzd\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.547573 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vphnm\" (UniqueName: \"kubernetes.io/projected/bbc3972e-f29c-430c-9da0-29f51a8e6a47-kube-api-access-vphnm\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.548179 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e0102d8-4abf-499f-bfe8-149ace187639-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e0102d8-4abf-499f-bfe8-149ace187639" (UID: "2e0102d8-4abf-499f-bfe8-149ace187639"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.607477 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qvgbw"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.614434 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.648068 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggbvt\" (UniqueName: \"kubernetes.io/projected/60fe20ca-6454-4f38-90ab-16facbb9fb53-kube-api-access-ggbvt\") pod \"60fe20ca-6454-4f38-90ab-16facbb9fb53\" (UID: \"60fe20ca-6454-4f38-90ab-16facbb9fb53\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.648142 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwtl6\" (UniqueName: \"kubernetes.io/projected/15bcd5c9-1256-4778-b245-3f19bc742903-kube-api-access-zwtl6\") pod \"15bcd5c9-1256-4778-b245-3f19bc742903\" (UID: \"15bcd5c9-1256-4778-b245-3f19bc742903\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.648247 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fe20ca-6454-4f38-90ab-16facbb9fb53-operator-scripts\") pod \"60fe20ca-6454-4f38-90ab-16facbb9fb53\" (UID: \"60fe20ca-6454-4f38-90ab-16facbb9fb53\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.648344 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15bcd5c9-1256-4778-b245-3f19bc742903-operator-scripts\") pod \"15bcd5c9-1256-4778-b245-3f19bc742903\" (UID: \"15bcd5c9-1256-4778-b245-3f19bc742903\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.648751 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e0102d8-4abf-499f-bfe8-149ace187639-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.649234 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15bcd5c9-1256-4778-b245-3f19bc742903-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15bcd5c9-1256-4778-b245-3f19bc742903" (UID: "15bcd5c9-1256-4778-b245-3f19bc742903"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.651624 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60fe20ca-6454-4f38-90ab-16facbb9fb53-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "60fe20ca-6454-4f38-90ab-16facbb9fb53" (UID: "60fe20ca-6454-4f38-90ab-16facbb9fb53"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.657422 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15bcd5c9-1256-4778-b245-3f19bc742903-kube-api-access-zwtl6" (OuterVolumeSpecName: "kube-api-access-zwtl6") pod "15bcd5c9-1256-4778-b245-3f19bc742903" (UID: "15bcd5c9-1256-4778-b245-3f19bc742903"). InnerVolumeSpecName "kube-api-access-zwtl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.662699 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60fe20ca-6454-4f38-90ab-16facbb9fb53-kube-api-access-ggbvt" (OuterVolumeSpecName: "kube-api-access-ggbvt") pod "60fe20ca-6454-4f38-90ab-16facbb9fb53" (UID: "60fe20ca-6454-4f38-90ab-16facbb9fb53"). InnerVolumeSpecName "kube-api-access-ggbvt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.774516 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13e91365-d18b-4977-9292-91b3f98f8469" path="/var/lib/kubelet/pods/13e91365-d18b-4977-9292-91b3f98f8469/volumes"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.788316 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/60fe20ca-6454-4f38-90ab-16facbb9fb53-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.788349 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15bcd5c9-1256-4778-b245-3f19bc742903-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.788358 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggbvt\" (UniqueName: \"kubernetes.io/projected/60fe20ca-6454-4f38-90ab-16facbb9fb53-kube-api-access-ggbvt\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.788369 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwtl6\" (UniqueName: \"kubernetes.io/projected/15bcd5c9-1256-4778-b245-3f19bc742903-kube-api-access-zwtl6\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.814731 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s7cvf"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.825810 4788 generic.go:334] "Generic (PLEG): container finished" podID="855949a4-e027-44b4-8705-202c74c3ffdb" containerID="84ffdaa68b4ec177038b46443924be2446a2c21907fb89afa57c772f8ed8be25" exitCode=0
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.845168 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qvgbw"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.853421 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5b7d4466df-8w62q"]
Feb 19 09:03:26 crc kubenswrapper[4788]: E0219 09:03:26.853846 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60fe20ca-6454-4f38-90ab-16facbb9fb53" containerName="mariadb-account-create-update"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.853864 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="60fe20ca-6454-4f38-90ab-16facbb9fb53" containerName="mariadb-account-create-update"
Feb 19 09:03:26 crc kubenswrapper[4788]: E0219 09:03:26.853879 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15bcd5c9-1256-4778-b245-3f19bc742903" containerName="mariadb-database-create"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.853887 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="15bcd5c9-1256-4778-b245-3f19bc742903" containerName="mariadb-database-create"
Feb 19 09:03:26 crc kubenswrapper[4788]: E0219 09:03:26.853912 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43a0bc66-750b-4618-bd07-033c189eafcf" containerName="mariadb-database-create"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.853918 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a0bc66-750b-4618-bd07-033c189eafcf" containerName="mariadb-database-create"
Feb 19 09:03:26 crc kubenswrapper[4788]: E0219 09:03:26.853927 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc3972e-f29c-430c-9da0-29f51a8e6a47" containerName="mariadb-account-create-update"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.853933 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc3972e-f29c-430c-9da0-29f51a8e6a47" containerName="mariadb-account-create-update"
Feb 19 09:03:26 crc kubenswrapper[4788]: E0219 09:03:26.853949 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db2289eb-977d-42f1-a70a-772737cc197a" containerName="mariadb-database-create"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.853957 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2289eb-977d-42f1-a70a-772737cc197a" containerName="mariadb-database-create"
Feb 19 09:03:26 crc kubenswrapper[4788]: E0219 09:03:26.853967 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0102d8-4abf-499f-bfe8-149ace187639" containerName="mariadb-account-create-update"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.853973 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0102d8-4abf-499f-bfe8-149ace187639" containerName="mariadb-account-create-update"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.854118 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbc3972e-f29c-430c-9da0-29f51a8e6a47" containerName="mariadb-account-create-update"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.865204 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.870648 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="15bcd5c9-1256-4778-b245-3f19bc742903" containerName="mariadb-database-create"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.870718 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0102d8-4abf-499f-bfe8-149ace187639" containerName="mariadb-account-create-update"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.870746 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="43a0bc66-750b-4618-bd07-033c189eafcf" containerName="mariadb-database-create"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.870797 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="db2289eb-977d-42f1-a70a-772737cc197a" containerName="mariadb-database-create"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.870816 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="60fe20ca-6454-4f38-90ab-16facbb9fb53" containerName="mariadb-account-create-update"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.880197 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d128-account-create-update-8ptfd"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.887315 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cshbz"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.899713 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc9gc\" (UniqueName: \"kubernetes.io/projected/db2289eb-977d-42f1-a70a-772737cc197a-kube-api-access-gc9gc\") pod \"db2289eb-977d-42f1-a70a-772737cc197a\" (UID: \"db2289eb-977d-42f1-a70a-772737cc197a\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.899845 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db2289eb-977d-42f1-a70a-772737cc197a-operator-scripts\") pod \"db2289eb-977d-42f1-a70a-772737cc197a\" (UID: \"db2289eb-977d-42f1-a70a-772737cc197a\") "
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.901612 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db2289eb-977d-42f1-a70a-772737cc197a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db2289eb-977d-42f1-a70a-772737cc197a" (UID: "db2289eb-977d-42f1-a70a-772737cc197a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.930617 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"855949a4-e027-44b4-8705-202c74c3ffdb","Type":"ContainerDied","Data":"84ffdaa68b4ec177038b46443924be2446a2c21907fb89afa57c772f8ed8be25"}
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.930674 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qvgbw" event={"ID":"15bcd5c9-1256-4778-b245-3f19bc742903","Type":"ContainerDied","Data":"e291c2b34ad07f3c91a3268b8d529d0f034357a8a7f18158e6817cb18cd630fd"}
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.930690 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e291c2b34ad07f3c91a3268b8d529d0f034357a8a7f18158e6817cb18cd630fd"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.930712 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6d1f-account-create-update-sxrgh" event={"ID":"60fe20ca-6454-4f38-90ab-16facbb9fb53","Type":"ContainerDied","Data":"ef9a92f07aeb58c44cb4bf65a5486c1a87699d6c6178028ecb668a1c69ad709e"}
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.930724 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef9a92f07aeb58c44cb4bf65a5486c1a87699d6c6178028ecb668a1c69ad709e"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.930736 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d128-account-create-update-8ptfd" event={"ID":"bbc3972e-f29c-430c-9da0-29f51a8e6a47","Type":"ContainerDied","Data":"5cf6d4a7daf5a844127be979cb99e4ae61990771ca1b589d65223dca2a9445b9"}
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.930745 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cf6d4a7daf5a844127be979cb99e4ae61990771ca1b589d65223dca2a9445b9"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.930753 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cshbz" event={"ID":"43a0bc66-750b-4618-bd07-033c189eafcf","Type":"ContainerDied","Data":"ec686f5bc5b153ad7559aca2fc578796b517bd89928bc4205205bd98cf87eb54"}
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.930763 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec686f5bc5b153ad7559aca2fc578796b517bd89928bc4205205bd98cf87eb54"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.930853 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.950412 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db2289eb-977d-42f1-a70a-772737cc197a-kube-api-access-gc9gc" (OuterVolumeSpecName: "kube-api-access-gc9gc") pod "db2289eb-977d-42f1-a70a-772737cc197a" (UID: "db2289eb-977d-42f1-a70a-772737cc197a"). InnerVolumeSpecName "kube-api-access-gc9gc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.950830 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.951105 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-tjzqh"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.959774 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5b7d4466df-8w62q"]
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.966300 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.966821 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-s7cvf" event={"ID":"db2289eb-977d-42f1-a70a-772737cc197a","Type":"ContainerDied","Data":"fe17720a957772033812087317d41d11249db3d725b5d188925fd05758c36efb"}
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.966871 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe17720a957772033812087317d41d11249db3d725b5d188925fd05758c36efb"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.966990 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-s7cvf"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.986051 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9025-account-create-update-nrk5c" event={"ID":"2e0102d8-4abf-499f-bfe8-149ace187639","Type":"ContainerDied","Data":"a94fc4bf601ea12e76e30436f728555ebed98dc809e97eecc9601d6d751f36d9"}
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.986104 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a94fc4bf601ea12e76e30436f728555ebed98dc809e97eecc9601d6d751f36d9"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.986227 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9025-account-create-update-nrk5c"
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.993339 4788 generic.go:334] "Generic (PLEG): container finished" podID="ef723535-d46b-46be-a561-cedf85829157" containerID="0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c" exitCode=0
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.997777 4788 generic.go:334] "Generic (PLEG): container finished" podID="ef723535-d46b-46be-a561-cedf85829157" containerID="1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a" exitCode=2
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.997815 4788 generic.go:334] "Generic (PLEG): container finished" podID="ef723535-d46b-46be-a561-cedf85829157" containerID="31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a" exitCode=0
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.994060 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef723535-d46b-46be-a561-cedf85829157","Type":"ContainerDied","Data":"0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c"}
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.997892 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef723535-d46b-46be-a561-cedf85829157","Type":"ContainerDied","Data":"1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a"}
Feb 19 09:03:26 crc kubenswrapper[4788]: I0219 09:03:26.997934 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef723535-d46b-46be-a561-cedf85829157","Type":"ContainerDied","Data":"31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a"}
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.004202 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-combined-ca-bundle\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.004309 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data-custom\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.004459 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.004539 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksq6m\" (UniqueName: \"kubernetes.io/projected/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-kube-api-access-ksq6m\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.004741 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc9gc\" (UniqueName: \"kubernetes.io/projected/db2289eb-977d-42f1-a70a-772737cc197a-kube-api-access-gc9gc\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.004764 4788 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db2289eb-977d-42f1-a70a-772737cc197a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.045435 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-64cb7499df-fjp7p"]
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.048679 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-64cb7499df-fjp7p"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.052096 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.063218 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-64cb7499df-fjp7p"]
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.078966 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-ttjbs"]
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.081549 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.090689 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-ttjbs"]
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106019 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data-custom\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106079 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106126 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106171 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksq6m\" (UniqueName: \"kubernetes.io/projected/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-kube-api-access-ksq6m\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106232 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jrq9\" (UniqueName: \"kubernetes.io/projected/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-kube-api-access-7jrq9\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106499 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106551 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg6w9\" (UniqueName: \"kubernetes.io/projected/163969c1-3ad0-4173-ac14-6ef793fa8f13-kube-api-access-kg6w9\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106589 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-config\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106632 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-combined-ca-bundle\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106660 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106689 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106717 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data-custom\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106753 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.106785 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-combined-ca-bundle\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.112999 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.113095 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-combined-ca-bundle\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.120289 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data-custom\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.143698 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksq6m\" (UniqueName: \"kubernetes.io/projected/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-kube-api-access-ksq6m\") pod \"heat-engine-5b7d4466df-8w62q\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.146322 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5599cffd79-mbvfb"]
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.147511 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5599cffd79-mbvfb"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.151355 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.168080 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5599cffd79-mbvfb"]
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.207843 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data-custom\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.209478 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.209637 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data\") pod \"heat-api-5599cffd79-mbvfb\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb"
Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.209895 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jrq9\" (UniqueName: \"kubernetes.io/projected/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-kube-api-access-7jrq9\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p"
Feb 19 09:03:27 crc
kubenswrapper[4788]: I0219 09:03:27.209941 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.210062 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg6w9\" (UniqueName: \"kubernetes.io/projected/163969c1-3ad0-4173-ac14-6ef793fa8f13-kube-api-access-kg6w9\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.210135 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-config\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.210328 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.210393 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.210428 4788 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-combined-ca-bundle\") pod \"heat-api-5599cffd79-mbvfb\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.210501 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.210860 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data-custom\") pod \"heat-api-5599cffd79-mbvfb\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.212539 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-combined-ca-bundle\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.212771 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkfl7\" (UniqueName: \"kubernetes.io/projected/530985ec-183d-4519-a8d2-b0aff1bb87b3-kube-api-access-fkfl7\") pod \"heat-api-5599cffd79-mbvfb\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 
09:03:27.212350 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data-custom\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.213385 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.214352 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.214852 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.215123 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.215598 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-config\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.216942 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.218231 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.229718 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-combined-ca-bundle\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.230838 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jrq9\" (UniqueName: \"kubernetes.io/projected/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-kube-api-access-7jrq9\") pod \"heat-cfnapi-64cb7499df-fjp7p\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") " pod="openstack/heat-cfnapi-64cb7499df-fjp7p" Feb 19 09:03:27 crc 
kubenswrapper[4788]: I0219 09:03:27.232204 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg6w9\" (UniqueName: \"kubernetes.io/projected/163969c1-3ad0-4173-ac14-6ef793fa8f13-kube-api-access-kg6w9\") pod \"dnsmasq-dns-7756b9d78c-ttjbs\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.235200 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5b7d4466df-8w62q" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.244072 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-64cb7499df-fjp7p" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.267628 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.314346 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-internal-tls-certs\") pod \"855949a4-e027-44b4-8705-202c74c3ffdb\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.314451 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-httpd-run\") pod \"855949a4-e027-44b4-8705-202c74c3ffdb\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.314504 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-config-data\") pod \"855949a4-e027-44b4-8705-202c74c3ffdb\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " Feb 19 09:03:27 crc 
kubenswrapper[4788]: I0219 09:03:27.314761 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wpbh\" (UniqueName: \"kubernetes.io/projected/855949a4-e027-44b4-8705-202c74c3ffdb-kube-api-access-7wpbh\") pod \"855949a4-e027-44b4-8705-202c74c3ffdb\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.314780 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-scripts\") pod \"855949a4-e027-44b4-8705-202c74c3ffdb\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.314846 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-combined-ca-bundle\") pod \"855949a4-e027-44b4-8705-202c74c3ffdb\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.314876 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-logs\") pod \"855949a4-e027-44b4-8705-202c74c3ffdb\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.314903 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"855949a4-e027-44b4-8705-202c74c3ffdb\" (UID: \"855949a4-e027-44b4-8705-202c74c3ffdb\") " Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.315216 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-combined-ca-bundle\") pod \"heat-api-5599cffd79-mbvfb\" (UID: 
\"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.315268 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data-custom\") pod \"heat-api-5599cffd79-mbvfb\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.315327 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkfl7\" (UniqueName: \"kubernetes.io/projected/530985ec-183d-4519-a8d2-b0aff1bb87b3-kube-api-access-fkfl7\") pod \"heat-api-5599cffd79-mbvfb\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.315435 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data\") pod \"heat-api-5599cffd79-mbvfb\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.316070 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "855949a4-e027-44b4-8705-202c74c3ffdb" (UID: "855949a4-e027-44b4-8705-202c74c3ffdb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.317911 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-logs" (OuterVolumeSpecName: "logs") pod "855949a4-e027-44b4-8705-202c74c3ffdb" (UID: "855949a4-e027-44b4-8705-202c74c3ffdb"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.330199 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data-custom\") pod \"heat-api-5599cffd79-mbvfb\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.330365 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "855949a4-e027-44b4-8705-202c74c3ffdb" (UID: "855949a4-e027-44b4-8705-202c74c3ffdb"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.330735 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-scripts" (OuterVolumeSpecName: "scripts") pod "855949a4-e027-44b4-8705-202c74c3ffdb" (UID: "855949a4-e027-44b4-8705-202c74c3ffdb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.331530 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-combined-ca-bundle\") pod \"heat-api-5599cffd79-mbvfb\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.331977 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data\") pod \"heat-api-5599cffd79-mbvfb\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.332139 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/855949a4-e027-44b4-8705-202c74c3ffdb-kube-api-access-7wpbh" (OuterVolumeSpecName: "kube-api-access-7wpbh") pod "855949a4-e027-44b4-8705-202c74c3ffdb" (UID: "855949a4-e027-44b4-8705-202c74c3ffdb"). InnerVolumeSpecName "kube-api-access-7wpbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.356101 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkfl7\" (UniqueName: \"kubernetes.io/projected/530985ec-183d-4519-a8d2-b0aff1bb87b3-kube-api-access-fkfl7\") pod \"heat-api-5599cffd79-mbvfb\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.410641 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.418302 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.419240 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wpbh\" (UniqueName: \"kubernetes.io/projected/855949a4-e027-44b4-8705-202c74c3ffdb-kube-api-access-7wpbh\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.445104 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.445183 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-logs\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.445552 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-58576d599f-zjnz8" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.447368 4788 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node 
\"crc\" " Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.447397 4788 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/855949a4-e027-44b4-8705-202c74c3ffdb-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.425772 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "855949a4-e027-44b4-8705-202c74c3ffdb" (UID: "855949a4-e027-44b4-8705-202c74c3ffdb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.425953 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "855949a4-e027-44b4-8705-202c74c3ffdb" (UID: "855949a4-e027-44b4-8705-202c74c3ffdb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.455157 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-config-data" (OuterVolumeSpecName: "config-data") pod "855949a4-e027-44b4-8705-202c74c3ffdb" (UID: "855949a4-e027-44b4-8705-202c74c3ffdb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.531648 4788 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.552572 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.552607 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.552623 4788 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.552636 4788 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/855949a4-e027-44b4-8705-202c74c3ffdb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:27 crc kubenswrapper[4788]: I0219 09:03:27.597719 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.030979 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-64cb7499df-fjp7p"] Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.033647 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"855949a4-e027-44b4-8705-202c74c3ffdb","Type":"ContainerDied","Data":"16f2e88691f0e23b8e6b98c9f4ca5a54b0afac0165ff2c3be7ecec7b737c8e06"} Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.033698 4788 scope.go:117] "RemoveContainer" containerID="84ffdaa68b4ec177038b46443924be2446a2c21907fb89afa57c772f8ed8be25" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.033841 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.042780 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"850e55a0-6179-423a-8698-ae1f87b8c049","Type":"ContainerStarted","Data":"ebe9c50a23c3928e42e3ce5aeb0c7742db3164893af7df986bf4424973980ce3"} Feb 19 09:03:28 crc kubenswrapper[4788]: W0219 09:03:28.052939 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bca39c6_35b4_4c4b_b3a5_d632836fb00c.slice/crio-a36927855690e3e29440a41e29d34b432488be75e31cfda7da0cefd084b58de4 WatchSource:0}: Error finding container a36927855690e3e29440a41e29d34b432488be75e31cfda7da0cefd084b58de4: Status 404 returned error can't find the container with id a36927855690e3e29440a41e29d34b432488be75e31cfda7da0cefd084b58de4 Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.088969 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.107365 
4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.115860 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:03:28 crc kubenswrapper[4788]: E0219 09:03:28.116405 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855949a4-e027-44b4-8705-202c74c3ffdb" containerName="glance-log" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.116472 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="855949a4-e027-44b4-8705-202c74c3ffdb" containerName="glance-log" Feb 19 09:03:28 crc kubenswrapper[4788]: E0219 09:03:28.116545 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855949a4-e027-44b4-8705-202c74c3ffdb" containerName="glance-httpd" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.116605 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="855949a4-e027-44b4-8705-202c74c3ffdb" containerName="glance-httpd" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.116810 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="855949a4-e027-44b4-8705-202c74c3ffdb" containerName="glance-httpd" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.116886 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="855949a4-e027-44b4-8705-202c74c3ffdb" containerName="glance-log" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.118927 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.126536 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.126990 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.164319 4788 scope.go:117] "RemoveContainer" containerID="6946745df3ff4febdf38cfda55b437589b22c84ad790a5b24c86aa802cad8ddf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.211344 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.215394 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.224236 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxrtn\" (UniqueName: \"kubernetes.io/projected/b4abac97-c381-46dc-8451-35c8db80c9bd-kube-api-access-gxrtn\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.224339 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.224478 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.224534 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.224611 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4abac97-c381-46dc-8451-35c8db80c9bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.226707 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4abac97-c381-46dc-8451-35c8db80c9bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.226781 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " 
pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.296219 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t48sf"] Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.313892 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t48sf"] Feb 19 09:03:28 crc kubenswrapper[4788]: W0219 09:03:28.316742 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod163969c1_3ad0_4173_ac14_6ef793fa8f13.slice/crio-6f9b3064b63d6ffe3478cb305072f83378331aa278d1b9136c6e24674db16629 WatchSource:0}: Error finding container 6f9b3064b63d6ffe3478cb305072f83378331aa278d1b9136c6e24674db16629: Status 404 returned error can't find the container with id 6f9b3064b63d6ffe3478cb305072f83378331aa278d1b9136c6e24674db16629 Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.317342 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.322660 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.322695 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zvvr6" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.327186 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.330115 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-ttjbs"] Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.335170 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4abac97-c381-46dc-8451-35c8db80c9bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.335229 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.335362 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.335385 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gxrtn\" (UniqueName: \"kubernetes.io/projected/b4abac97-c381-46dc-8451-35c8db80c9bd-kube-api-access-gxrtn\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.335408 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.335476 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.335505 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.335528 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4abac97-c381-46dc-8451-35c8db80c9bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.335996 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b4abac97-c381-46dc-8451-35c8db80c9bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.336205 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b4abac97-c381-46dc-8451-35c8db80c9bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.339300 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.341712 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.347715 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.349115 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5b7d4466df-8w62q"] Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.351259 4788 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.355696 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4abac97-c381-46dc-8451-35c8db80c9bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.358184 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxrtn\" (UniqueName: \"kubernetes.io/projected/b4abac97-c381-46dc-8451-35c8db80c9bd-kube-api-access-gxrtn\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.433749 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b4abac97-c381-46dc-8451-35c8db80c9bd\") " pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.440394 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-config-data\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.440537 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.440642 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-scripts\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.440707 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjt2t\" (UniqueName: \"kubernetes.io/projected/dbd8890d-e06b-45e5-865b-838e036ac302-kube-api-access-wjt2t\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: E0219 09:03:28.495155 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod855949a4_e027_44b4_8705_202c74c3ffdb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod855949a4_e027_44b4_8705_202c74c3ffdb.slice/crio-16f2e88691f0e23b8e6b98c9f4ca5a54b0afac0165ff2c3be7ecec7b737c8e06\": RecentStats: unable to find data in memory cache]" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.517947 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.530933 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5599cffd79-mbvfb"] Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.542351 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-scripts\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.542448 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjt2t\" (UniqueName: \"kubernetes.io/projected/dbd8890d-e06b-45e5-865b-838e036ac302-kube-api-access-wjt2t\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.542517 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-config-data\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.542609 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.548135 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-scripts\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.556026 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-config-data\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.559252 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.561948 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjt2t\" (UniqueName: \"kubernetes.io/projected/dbd8890d-e06b-45e5-865b-838e036ac302-kube-api-access-wjt2t\") pod \"nova-cell0-conductor-db-sync-t48sf\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.647118 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:03:28 crc kubenswrapper[4788]: I0219 09:03:28.754328 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="855949a4-e027-44b4-8705-202c74c3ffdb" path="/var/lib/kubelet/pods/855949a4-e027-44b4-8705-202c74c3ffdb/volumes" Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.092205 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64cb7499df-fjp7p" event={"ID":"5bca39c6-35b4-4c4b-b3a5-d632836fb00c","Type":"ContainerStarted","Data":"a36927855690e3e29440a41e29d34b432488be75e31cfda7da0cefd084b58de4"} Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.121452 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"850e55a0-6179-423a-8698-ae1f87b8c049","Type":"ContainerStarted","Data":"d5fedbed8a0c3a7c3b073e2dcef3cd042c4fa3a0ba9b096c753bf77777a04596"} Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.150002 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5599cffd79-mbvfb" event={"ID":"530985ec-183d-4519-a8d2-b0aff1bb87b3","Type":"ContainerStarted","Data":"edeb37d3b62355133cad50a0c7e519d3445afec9081d62c27daed0eb974b08a1"} Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.160402 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b7d4466df-8w62q" event={"ID":"c7b3ea5f-28c5-4aa3-ae05-94bce578741d","Type":"ContainerStarted","Data":"db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c"} Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.160471 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b7d4466df-8w62q" event={"ID":"c7b3ea5f-28c5-4aa3-ae05-94bce578741d","Type":"ContainerStarted","Data":"1570c84454328d5ef4c6b52e4987a7eebbf6a28df0a305905635215846480df3"} Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.160500 4788 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/heat-engine-5b7d4466df-8w62q" Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.183414 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5b7d4466df-8w62q" podStartSLOduration=3.183384402 podStartE2EDuration="3.183384402s" podCreationTimestamp="2026-02-19 09:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:29.18131693 +0000 UTC m=+1111.169328402" watchObservedRunningTime="2026-02-19 09:03:29.183384402 +0000 UTC m=+1111.171395874" Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.192922 4788 generic.go:334] "Generic (PLEG): container finished" podID="163969c1-3ad0-4173-ac14-6ef793fa8f13" containerID="fb5056a71bee8bc419c6855dc853b73663ccd072c837967f551b8c30e64a2dd1" exitCode=0 Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.192974 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" event={"ID":"163969c1-3ad0-4173-ac14-6ef793fa8f13","Type":"ContainerDied","Data":"fb5056a71bee8bc419c6855dc853b73663ccd072c837967f551b8c30e64a2dd1"} Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.193002 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" event={"ID":"163969c1-3ad0-4173-ac14-6ef793fa8f13","Type":"ContainerStarted","Data":"6f9b3064b63d6ffe3478cb305072f83378331aa278d1b9136c6e24674db16629"} Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.407990 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.499643 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t48sf"] Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.845738 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.911546 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-sg-core-conf-yaml\") pod \"ef723535-d46b-46be-a561-cedf85829157\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.911915 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf9t9\" (UniqueName: \"kubernetes.io/projected/ef723535-d46b-46be-a561-cedf85829157-kube-api-access-zf9t9\") pod \"ef723535-d46b-46be-a561-cedf85829157\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.911941 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-run-httpd\") pod \"ef723535-d46b-46be-a561-cedf85829157\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.912018 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-log-httpd\") pod \"ef723535-d46b-46be-a561-cedf85829157\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.912111 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-scripts\") pod \"ef723535-d46b-46be-a561-cedf85829157\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.912149 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-combined-ca-bundle\") pod \"ef723535-d46b-46be-a561-cedf85829157\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.912168 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-config-data\") pod \"ef723535-d46b-46be-a561-cedf85829157\" (UID: \"ef723535-d46b-46be-a561-cedf85829157\") " Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.912825 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ef723535-d46b-46be-a561-cedf85829157" (UID: "ef723535-d46b-46be-a561-cedf85829157"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.914214 4788 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.917128 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ef723535-d46b-46be-a561-cedf85829157" (UID: "ef723535-d46b-46be-a561-cedf85829157"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.927946 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-scripts" (OuterVolumeSpecName: "scripts") pod "ef723535-d46b-46be-a561-cedf85829157" (UID: "ef723535-d46b-46be-a561-cedf85829157"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:29 crc kubenswrapper[4788]: I0219 09:03:29.935189 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef723535-d46b-46be-a561-cedf85829157-kube-api-access-zf9t9" (OuterVolumeSpecName: "kube-api-access-zf9t9") pod "ef723535-d46b-46be-a561-cedf85829157" (UID: "ef723535-d46b-46be-a561-cedf85829157"). InnerVolumeSpecName "kube-api-access-zf9t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.017337 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.017390 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf9t9\" (UniqueName: \"kubernetes.io/projected/ef723535-d46b-46be-a561-cedf85829157-kube-api-access-zf9t9\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.017415 4788 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ef723535-d46b-46be-a561-cedf85829157-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.027776 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ef723535-d46b-46be-a561-cedf85829157" (UID: "ef723535-d46b-46be-a561-cedf85829157"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.111029 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef723535-d46b-46be-a561-cedf85829157" (UID: "ef723535-d46b-46be-a561-cedf85829157"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.125515 4788 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.125563 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.184105 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-config-data" (OuterVolumeSpecName: "config-data") pod "ef723535-d46b-46be-a561-cedf85829157" (UID: "ef723535-d46b-46be-a561-cedf85829157"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.229923 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef723535-d46b-46be-a561-cedf85829157-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.233892 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.233987 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef723535-d46b-46be-a561-cedf85829157","Type":"ContainerDied","Data":"51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4"} Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.234050 4788 scope.go:117] "RemoveContainer" containerID="0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.248826 4788 generic.go:334] "Generic (PLEG): container finished" podID="ef723535-d46b-46be-a561-cedf85829157" containerID="51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4" exitCode=0 Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.248992 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ef723535-d46b-46be-a561-cedf85829157","Type":"ContainerDied","Data":"5dbfdc8e3bfc2aa620154322ebcedef771d1d62269bb7d96c1a50563f2b08382"} Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.263498 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t48sf" event={"ID":"dbd8890d-e06b-45e5-865b-838e036ac302","Type":"ContainerStarted","Data":"740aa7921bd9b4201c1a2df85f945ec2e5f55cba2a0f526691e4bdda3d0651f9"} Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.265202 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b4abac97-c381-46dc-8451-35c8db80c9bd","Type":"ContainerStarted","Data":"5c3c9cf30a605160493538b7c500911899561fb9b2b10028d892d87b5f3d4a2f"} Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.270951 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" 
event={"ID":"163969c1-3ad0-4173-ac14-6ef793fa8f13","Type":"ContainerStarted","Data":"6119ccfe1a249c33c4e565dd52aab75efb891b491fb1347d6ac5e1c7cb311cd5"} Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.271526 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.330341 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.333170 4788 scope.go:117] "RemoveContainer" containerID="1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.371093 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.391420 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:03:30 crc kubenswrapper[4788]: E0219 09:03:30.391864 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="ceilometer-notification-agent" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.391883 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="ceilometer-notification-agent" Feb 19 09:03:30 crc kubenswrapper[4788]: E0219 09:03:30.391898 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="sg-core" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.391905 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="sg-core" Feb 19 09:03:30 crc kubenswrapper[4788]: E0219 09:03:30.391922 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="ceilometer-central-agent" Feb 19 09:03:30 crc 
kubenswrapper[4788]: I0219 09:03:30.391932 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="ceilometer-central-agent" Feb 19 09:03:30 crc kubenswrapper[4788]: E0219 09:03:30.391953 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="proxy-httpd" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.391960 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="proxy-httpd" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.392156 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="proxy-httpd" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.392173 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="ceilometer-notification-agent" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.392188 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="sg-core" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.392201 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef723535-d46b-46be-a561-cedf85829157" containerName="ceilometer-central-agent" Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.394007 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.396926 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.397137 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.401028 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" podStartSLOduration=4.401011536 podStartE2EDuration="4.401011536s" podCreationTimestamp="2026-02-19 09:03:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:30.337676095 +0000 UTC m=+1112.325687587" watchObservedRunningTime="2026-02-19 09:03:30.401011536 +0000 UTC m=+1112.389023008"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.420687 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.433474 4788 scope.go:117] "RemoveContainer" containerID="31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.538340 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-log-httpd\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.538741 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.538774 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-config-data\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.538817 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-scripts\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.538920 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-run-httpd\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.538946 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lfbd\" (UniqueName: \"kubernetes.io/projected/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-kube-api-access-5lfbd\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.539004 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.547604 4788 scope.go:117] "RemoveContainer" containerID="51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.604857 4788 scope.go:117] "RemoveContainer" containerID="0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c"
Feb 19 09:03:30 crc kubenswrapper[4788]: E0219 09:03:30.606585 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c\": container with ID starting with 0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c not found: ID does not exist" containerID="0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.606637 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c"} err="failed to get container status \"0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c\": rpc error: code = NotFound desc = could not find container \"0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c\": container with ID starting with 0862bd5e22cff6e1b74433ed87113c91b693e56ba86e1948972025c84991039c not found: ID does not exist"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.606670 4788 scope.go:117] "RemoveContainer" containerID="1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a"
Feb 19 09:03:30 crc kubenswrapper[4788]: E0219 09:03:30.614864 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a\": container with ID starting with 1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a not found: ID does not exist" containerID="1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.614923 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a"} err="failed to get container status \"1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a\": rpc error: code = NotFound desc = could not find container \"1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a\": container with ID starting with 1a86076fd15bf769be54b8a05d65bebb25ccb0b51317462abda4669859a9972a not found: ID does not exist"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.614955 4788 scope.go:117] "RemoveContainer" containerID="31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a"
Feb 19 09:03:30 crc kubenswrapper[4788]: E0219 09:03:30.635527 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a\": container with ID starting with 31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a not found: ID does not exist" containerID="31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.635583 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a"} err="failed to get container status \"31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a\": rpc error: code = NotFound desc = could not find container \"31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a\": container with ID starting with 31666832abfcdb10f03c2305be2ef19e8545d5bb3bf5e4c0204bc7126af87f4a not found: ID does not exist"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.635615 4788 scope.go:117] "RemoveContainer" containerID="51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4"
Feb 19 09:03:30 crc kubenswrapper[4788]: E0219 09:03:30.638638 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4\": container with ID starting with 51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4 not found: ID does not exist" containerID="51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.638694 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4"} err="failed to get container status \"51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4\": rpc error: code = NotFound desc = could not find container \"51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4\": container with ID starting with 51557aef9e7d8dcc2687dad02360839c79bfc7f48095ef50b19198b5fc8109f4 not found: ID does not exist"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.640653 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-run-httpd\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.640716 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lfbd\" (UniqueName: \"kubernetes.io/projected/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-kube-api-access-5lfbd\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.640782 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.640908 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-log-httpd\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.640956 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.640999 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-config-data\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.641037 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-scripts\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.641307 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-run-httpd\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.641673 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-log-httpd\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.651760 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-config-data\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.652511 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.655561 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-scripts\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.656735 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.687036 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lfbd\" (UniqueName: \"kubernetes.io/projected/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-kube-api-access-5lfbd\") pod \"ceilometer-0\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") " pod="openstack/ceilometer-0"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.729496 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef723535-d46b-46be-a561-cedf85829157" path="/var/lib/kubelet/pods/ef723535-d46b-46be-a561-cedf85829157/volumes"
Feb 19 09:03:30 crc kubenswrapper[4788]: I0219 09:03:30.736813 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 09:03:31 crc kubenswrapper[4788]: I0219 09:03:31.309897 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b4abac97-c381-46dc-8451-35c8db80c9bd","Type":"ContainerStarted","Data":"ac714bd39316847cc35bad36326b0f0f811c02ed66b38e37a4fe1c7ff53b8d18"}
Feb 19 09:03:31 crc kubenswrapper[4788]: I0219 09:03:31.321908 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"850e55a0-6179-423a-8698-ae1f87b8c049","Type":"ContainerStarted","Data":"ae59bcbd24cba075cd845767859f36fc42d5f9cb0ea04836ba5a8f88b704f3c6"}
Feb 19 09:03:31 crc kubenswrapper[4788]: I0219 09:03:31.403499 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.403474433 podStartE2EDuration="6.403474433s" podCreationTimestamp="2026-02-19 09:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:31.380210646 +0000 UTC m=+1113.368222138" watchObservedRunningTime="2026-02-19 09:03:31.403474433 +0000 UTC m=+1113.391485905"
Feb 19 09:03:31 crc kubenswrapper[4788]: I0219 09:03:31.405473 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:03:32 crc kubenswrapper[4788]: I0219 09:03:32.335847 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b4abac97-c381-46dc-8451-35c8db80c9bd","Type":"ContainerStarted","Data":"555264d61e654d32b582b8ec01902cea7d867234a1c49b1f9735b2cb6e32e225"}
Feb 19 09:03:32 crc kubenswrapper[4788]: I0219 09:03:32.339015 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a","Type":"ContainerStarted","Data":"fc43e4adb5dcd226eb71dc18698879a4c1dbe9d28b0f2099116e46bbe9db16a8"}
Feb 19 09:03:32 crc kubenswrapper[4788]: I0219 09:03:32.361568 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.361547228 podStartE2EDuration="4.361547228s" podCreationTimestamp="2026-02-19 09:03:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:32.35837859 +0000 UTC m=+1114.346390072" watchObservedRunningTime="2026-02-19 09:03:32.361547228 +0000 UTC m=+1114.349558700"
Feb 19 09:03:32 crc kubenswrapper[4788]: I0219 09:03:32.895436 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 19 09:03:34 crc kubenswrapper[4788]: I0219 09:03:34.361387 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5599cffd79-mbvfb" event={"ID":"530985ec-183d-4519-a8d2-b0aff1bb87b3","Type":"ContainerStarted","Data":"2bbafe4feb95a5e642e8c9c24bb6387d0f5a1f87470854c667a3746347e71759"}
Feb 19 09:03:34 crc kubenswrapper[4788]: I0219 09:03:34.362615 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5599cffd79-mbvfb"
Feb 19 09:03:34 crc kubenswrapper[4788]: I0219 09:03:34.368330 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a","Type":"ContainerStarted","Data":"fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733"}
Feb 19 09:03:34 crc kubenswrapper[4788]: I0219 09:03:34.382147 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64cb7499df-fjp7p" event={"ID":"5bca39c6-35b4-4c4b-b3a5-d632836fb00c","Type":"ContainerStarted","Data":"ab2acc6732938dcf307008d2da05b7fb7e4d8a648977949a2a559f7b791dc4f9"}
Feb 19 09:03:34 crc kubenswrapper[4788]: I0219 09:03:34.382292 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-64cb7499df-fjp7p"
Feb 19 09:03:34 crc kubenswrapper[4788]: I0219 09:03:34.386103 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5599cffd79-mbvfb" podStartSLOduration=3.256493816 podStartE2EDuration="8.386087177s" podCreationTimestamp="2026-02-19 09:03:26 +0000 UTC" firstStartedPulling="2026-02-19 09:03:28.616752827 +0000 UTC m=+1110.604764299" lastFinishedPulling="2026-02-19 09:03:33.746346188 +0000 UTC m=+1115.734357660" observedRunningTime="2026-02-19 09:03:34.382065938 +0000 UTC m=+1116.370077410" watchObservedRunningTime="2026-02-19 09:03:34.386087177 +0000 UTC m=+1116.374098649"
Feb 19 09:03:34 crc kubenswrapper[4788]: I0219 09:03:34.410814 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-64cb7499df-fjp7p" podStartSLOduration=2.726718676 podStartE2EDuration="8.410797911s" podCreationTimestamp="2026-02-19 09:03:26 +0000 UTC" firstStartedPulling="2026-02-19 09:03:28.057350501 +0000 UTC m=+1110.045361973" lastFinishedPulling="2026-02-19 09:03:33.741429736 +0000 UTC m=+1115.729441208" observedRunningTime="2026-02-19 09:03:34.407069328 +0000 UTC m=+1116.395080800" watchObservedRunningTime="2026-02-19 09:03:34.410797911 +0000 UTC m=+1116.398809383"
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.261439 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.401046 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a","Type":"ContainerStarted","Data":"eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150"}
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.909203 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-69f6799bd7-ht4q2"]
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.910581 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.943411 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-69f6799bd7-ht4q2"]
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.969591 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6f9d44df-8wrgp"]
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.975911 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.982597 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02915100-34f3-4f6d-945c-1417a4bd06f7-combined-ca-bundle\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.982713 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02915100-34f3-4f6d-945c-1417a4bd06f7-config-data\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.982761 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcwbg\" (UniqueName: \"kubernetes.io/projected/02915100-34f3-4f6d-945c-1417a4bd06f7-kube-api-access-fcwbg\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.982784 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02915100-34f3-4f6d-945c-1417a4bd06f7-config-data-custom\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.984338 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-85d59d9b8-kg6zk"]
Feb 19 09:03:35 crc kubenswrapper[4788]: I0219 09:03:35.985629 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.032798 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6f9d44df-8wrgp"]
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.043126 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-85d59d9b8-kg6zk"]
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.085235 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.085301 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkj5q\" (UniqueName: \"kubernetes.io/projected/3e03adfa-047d-4b46-937c-80d72bd604c5-kube-api-access-nkj5q\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.085334 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02915100-34f3-4f6d-945c-1417a4bd06f7-config-data\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.085360 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcwbg\" (UniqueName: \"kubernetes.io/projected/02915100-34f3-4f6d-945c-1417a4bd06f7-kube-api-access-fcwbg\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.085376 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02915100-34f3-4f6d-945c-1417a4bd06f7-config-data-custom\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.085425 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data-custom\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.086204 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-combined-ca-bundle\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.086336 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gts8b\" (UniqueName: \"kubernetes.io/projected/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-kube-api-access-gts8b\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.086386 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-combined-ca-bundle\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.086573 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.086678 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02915100-34f3-4f6d-945c-1417a4bd06f7-combined-ca-bundle\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.086712 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data-custom\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.091987 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02915100-34f3-4f6d-945c-1417a4bd06f7-config-data\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.093931 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/02915100-34f3-4f6d-945c-1417a4bd06f7-config-data-custom\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.101072 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02915100-34f3-4f6d-945c-1417a4bd06f7-combined-ca-bundle\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.104861 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcwbg\" (UniqueName: \"kubernetes.io/projected/02915100-34f3-4f6d-945c-1417a4bd06f7-kube-api-access-fcwbg\") pod \"heat-engine-69f6799bd7-ht4q2\" (UID: \"02915100-34f3-4f6d-945c-1417a4bd06f7\") " pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.188663 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data-custom\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.188738 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.188766 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkj5q\" (UniqueName: \"kubernetes.io/projected/3e03adfa-047d-4b46-937c-80d72bd604c5-kube-api-access-nkj5q\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.188809 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data-custom\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.188832 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-combined-ca-bundle\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.188862 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gts8b\" (UniqueName: \"kubernetes.io/projected/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-kube-api-access-gts8b\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.188878 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-combined-ca-bundle\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.188936 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.193695 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-combined-ca-bundle\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.196679 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data-custom\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.197293 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.197643 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data-custom\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.202187 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.204200 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-combined-ca-bundle\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.216755 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gts8b\" (UniqueName: \"kubernetes.io/projected/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-kube-api-access-gts8b\") pod \"heat-api-85d59d9b8-kg6zk\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.217033 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkj5q\" (UniqueName: \"kubernetes.io/projected/3e03adfa-047d-4b46-937c-80d72bd604c5-kube-api-access-nkj5q\") pod \"heat-cfnapi-6f9d44df-8wrgp\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.234910 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.274028 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.278374 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.303785 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.317539 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.318286 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.329722 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.408216 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 09:03:36 crc kubenswrapper[4788]: I0219 09:03:36.408280 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 19 09:03:37 crc kubenswrapper[4788]: I0219 09:03:37.270988 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs"
Feb 19 09:03:37 crc kubenswrapper[4788]: I0219 09:03:37.345330 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gmkv7"]
Feb 19 09:03:37 crc kubenswrapper[4788]: I0219 09:03:37.345711 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" podUID="379f970e-625d-401e-b625-a81a8e19ec02" containerName="dnsmasq-dns" containerID="cri-o://502a0091c25fe8629946739276942cf44b25e6393de5a151cf8436937dd87611" gracePeriod=10
Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.447827 4788 generic.go:334] "Generic (PLEG): container finished" podID="379f970e-625d-401e-b625-a81a8e19ec02" containerID="502a0091c25fe8629946739276942cf44b25e6393de5a151cf8436937dd87611" exitCode=0
Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.447862 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" event={"ID":"379f970e-625d-401e-b625-a81a8e19ec02","Type":"ContainerDied","Data":"502a0091c25fe8629946739276942cf44b25e6393de5a151cf8436937dd87611"}
Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.519483 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup"
status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.519535 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.570749 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.577433 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.846451 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5599cffd79-mbvfb"] Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.846888 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5599cffd79-mbvfb" podUID="530985ec-183d-4519-a8d2-b0aff1bb87b3" containerName="heat-api" containerID="cri-o://2bbafe4feb95a5e642e8c9c24bb6387d0f5a1f87470854c667a3746347e71759" gracePeriod=60 Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.862849 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-64cb7499df-fjp7p"] Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.863305 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-64cb7499df-fjp7p" podUID="5bca39c6-35b4-4c4b-b3a5-d632836fb00c" containerName="heat-cfnapi" containerID="cri-o://ab2acc6732938dcf307008d2da05b7fb7e4d8a648977949a2a559f7b791dc4f9" gracePeriod=60 Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.884766 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-746b766c9d-9j8dk"] Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.886608 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.889065 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.889352 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.913561 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6f874b587-dc7jr"] Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.916050 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.921835 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.922544 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.942847 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6f874b587-dc7jr"] Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.966492 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-config-data\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.966802 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-internal-tls-certs\") pod \"heat-api-746b766c9d-9j8dk\" (UID: 
\"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.966941 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-public-tls-certs\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.967126 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-combined-ca-bundle\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.967248 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-config-data-custom\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.967373 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqvkx\" (UniqueName: \"kubernetes.io/projected/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-kube-api-access-mqvkx\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:38 crc kubenswrapper[4788]: I0219 09:03:38.976510 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-746b766c9d-9j8dk"] Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.068917 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-config-data\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.068965 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-config-data-custom\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.068987 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-internal-tls-certs\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.069008 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-internal-tls-certs\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.069025 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-config-data\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.069062 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-public-tls-certs\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.069079 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-public-tls-certs\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.069147 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7dd6\" (UniqueName: \"kubernetes.io/projected/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-kube-api-access-n7dd6\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.069171 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-combined-ca-bundle\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.069190 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-combined-ca-bundle\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.069215 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-config-data-custom\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.069260 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqvkx\" (UniqueName: \"kubernetes.io/projected/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-kube-api-access-mqvkx\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.076915 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-public-tls-certs\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.077637 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-internal-tls-certs\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.078496 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-config-data\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.085474 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-combined-ca-bundle\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.094159 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-config-data-custom\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.101114 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqvkx\" (UniqueName: \"kubernetes.io/projected/b7b1972f-d2de-4154-a5fd-1b0adb9952a8-kube-api-access-mqvkx\") pod \"heat-api-746b766c9d-9j8dk\" (UID: \"b7b1972f-d2de-4154-a5fd-1b0adb9952a8\") " pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.171031 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-config-data-custom\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.171078 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-internal-tls-certs\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.171102 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-config-data\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.171142 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-public-tls-certs\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.171318 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dd6\" (UniqueName: \"kubernetes.io/projected/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-kube-api-access-n7dd6\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.171349 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-combined-ca-bundle\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.177768 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-combined-ca-bundle\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.178364 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-public-tls-certs\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.179808 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.179936 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.182034 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-config-data-custom\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.183091 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-config-data\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.183788 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-internal-tls-certs\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: \"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.193912 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7dd6\" (UniqueName: \"kubernetes.io/projected/d32302e3-6d30-4f9e-b993-e5fbaae1b9eb-kube-api-access-n7dd6\") pod \"heat-cfnapi-6f874b587-dc7jr\" (UID: 
\"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb\") " pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.199750 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.224634 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.250897 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.468491 4788 generic.go:334] "Generic (PLEG): container finished" podID="530985ec-183d-4519-a8d2-b0aff1bb87b3" containerID="2bbafe4feb95a5e642e8c9c24bb6387d0f5a1f87470854c667a3746347e71759" exitCode=0 Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.468564 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5599cffd79-mbvfb" event={"ID":"530985ec-183d-4519-a8d2-b0aff1bb87b3","Type":"ContainerDied","Data":"2bbafe4feb95a5e642e8c9c24bb6387d0f5a1f87470854c667a3746347e71759"} Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.487940 4788 generic.go:334] "Generic (PLEG): container finished" podID="5bca39c6-35b4-4c4b-b3a5-d632836fb00c" containerID="ab2acc6732938dcf307008d2da05b7fb7e4d8a648977949a2a559f7b791dc4f9" exitCode=0 Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.488130 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64cb7499df-fjp7p" event={"ID":"5bca39c6-35b4-4c4b-b3a5-d632836fb00c","Type":"ContainerDied","Data":"ab2acc6732938dcf307008d2da05b7fb7e4d8a648977949a2a559f7b791dc4f9"} Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 09:03:39.488710 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:39 crc kubenswrapper[4788]: I0219 
09:03:39.488932 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:41 crc kubenswrapper[4788]: I0219 09:03:41.506807 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 09:03:41 crc kubenswrapper[4788]: I0219 09:03:41.507141 4788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 19 09:03:42 crc kubenswrapper[4788]: I0219 09:03:42.083216 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:42 crc kubenswrapper[4788]: I0219 09:03:42.225484 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" podUID="379f970e-625d-401e-b625-a81a8e19ec02" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.163:5353: connect: connection refused" Feb 19 09:03:42 crc kubenswrapper[4788]: I0219 09:03:42.247178 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-64cb7499df-fjp7p" podUID="5bca39c6-35b4-4c4b-b3a5-d632836fb00c" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.181:8000/healthcheck\": dial tcp 10.217.0.181:8000: connect: connection refused" Feb 19 09:03:42 crc kubenswrapper[4788]: I0219 09:03:42.408587 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 19 09:03:42 crc kubenswrapper[4788]: I0219 09:03:42.602537 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5599cffd79-mbvfb" podUID="530985ec-183d-4519-a8d2-b0aff1bb87b3" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.183:8004/healthcheck\": dial tcp 10.217.0.183:8004: connect: connection refused" Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.654105 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5599cffd79-mbvfb" Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.669368 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.686964 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzds6\" (UniqueName: \"kubernetes.io/projected/379f970e-625d-401e-b625-a81a8e19ec02-kube-api-access-rzds6\") pod \"379f970e-625d-401e-b625-a81a8e19ec02\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.687063 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-sb\") pod \"379f970e-625d-401e-b625-a81a8e19ec02\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.687097 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-swift-storage-0\") pod \"379f970e-625d-401e-b625-a81a8e19ec02\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.687182 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkfl7\" (UniqueName: \"kubernetes.io/projected/530985ec-183d-4519-a8d2-b0aff1bb87b3-kube-api-access-fkfl7\") pod \"530985ec-183d-4519-a8d2-b0aff1bb87b3\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.687214 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-config\") pod \"379f970e-625d-401e-b625-a81a8e19ec02\" 
(UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.687285 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data\") pod \"530985ec-183d-4519-a8d2-b0aff1bb87b3\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.687320 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-nb\") pod \"379f970e-625d-401e-b625-a81a8e19ec02\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.687396 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-combined-ca-bundle\") pod \"530985ec-183d-4519-a8d2-b0aff1bb87b3\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.687512 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-svc\") pod \"379f970e-625d-401e-b625-a81a8e19ec02\" (UID: \"379f970e-625d-401e-b625-a81a8e19ec02\") " Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.687561 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data-custom\") pod \"530985ec-183d-4519-a8d2-b0aff1bb87b3\" (UID: \"530985ec-183d-4519-a8d2-b0aff1bb87b3\") " Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.722861 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "530985ec-183d-4519-a8d2-b0aff1bb87b3" (UID: "530985ec-183d-4519-a8d2-b0aff1bb87b3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.725492 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379f970e-625d-401e-b625-a81a8e19ec02-kube-api-access-rzds6" (OuterVolumeSpecName: "kube-api-access-rzds6") pod "379f970e-625d-401e-b625-a81a8e19ec02" (UID: "379f970e-625d-401e-b625-a81a8e19ec02"). InnerVolumeSpecName "kube-api-access-rzds6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.737472 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/530985ec-183d-4519-a8d2-b0aff1bb87b3-kube-api-access-fkfl7" (OuterVolumeSpecName: "kube-api-access-fkfl7") pod "530985ec-183d-4519-a8d2-b0aff1bb87b3" (UID: "530985ec-183d-4519-a8d2-b0aff1bb87b3"). InnerVolumeSpecName "kube-api-access-fkfl7". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.806553 4788 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.806591 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzds6\" (UniqueName: \"kubernetes.io/projected/379f970e-625d-401e-b625-a81a8e19ec02-kube-api-access-rzds6\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.806603 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkfl7\" (UniqueName: \"kubernetes.io/projected/530985ec-183d-4519-a8d2-b0aff1bb87b3-kube-api-access-fkfl7\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.815063 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-64cb7499df-fjp7p"
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.826547 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-config" (OuterVolumeSpecName: "config") pod "379f970e-625d-401e-b625-a81a8e19ec02" (UID: "379f970e-625d-401e-b625-a81a8e19ec02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.871578 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "530985ec-183d-4519-a8d2-b0aff1bb87b3" (UID: "530985ec-183d-4519-a8d2-b0aff1bb87b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.909377 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-combined-ca-bundle\") pod \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") "
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.909422 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jrq9\" (UniqueName: \"kubernetes.io/projected/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-kube-api-access-7jrq9\") pod \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") "
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.909455 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data\") pod \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") "
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.909666 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data-custom\") pod \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\" (UID: \"5bca39c6-35b4-4c4b-b3a5-d632836fb00c\") "
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.910038 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-config\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.910059 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.936122 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5bca39c6-35b4-4c4b-b3a5-d632836fb00c" (UID: "5bca39c6-35b4-4c4b-b3a5-d632836fb00c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:44 crc kubenswrapper[4788]: I0219 09:03:44.941785 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-kube-api-access-7jrq9" (OuterVolumeSpecName: "kube-api-access-7jrq9") pod "5bca39c6-35b4-4c4b-b3a5-d632836fb00c" (UID: "5bca39c6-35b4-4c4b-b3a5-d632836fb00c"). InnerVolumeSpecName "kube-api-access-7jrq9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.012093 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jrq9\" (UniqueName: \"kubernetes.io/projected/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-kube-api-access-7jrq9\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.012393 4788 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.034175 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "379f970e-625d-401e-b625-a81a8e19ec02" (UID: "379f970e-625d-401e-b625-a81a8e19ec02"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.055423 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "379f970e-625d-401e-b625-a81a8e19ec02" (UID: "379f970e-625d-401e-b625-a81a8e19ec02"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.075466 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data" (OuterVolumeSpecName: "config-data") pod "530985ec-183d-4519-a8d2-b0aff1bb87b3" (UID: "530985ec-183d-4519-a8d2-b0aff1bb87b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.079024 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "379f970e-625d-401e-b625-a81a8e19ec02" (UID: "379f970e-625d-401e-b625-a81a8e19ec02"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.084481 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "379f970e-625d-401e-b625-a81a8e19ec02" (UID: "379f970e-625d-401e-b625-a81a8e19ec02"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.101793 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bca39c6-35b4-4c4b-b3a5-d632836fb00c" (UID: "5bca39c6-35b4-4c4b-b3a5-d632836fb00c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.114322 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.114359 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/530985ec-183d-4519-a8d2-b0aff1bb87b3-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.114372 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.114386 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.114397 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.114515 4788 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/379f970e-625d-401e-b625-a81a8e19ec02-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.122126 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data" (OuterVolumeSpecName: "config-data") pod "5bca39c6-35b4-4c4b-b3a5-d632836fb00c" (UID: "5bca39c6-35b4-4c4b-b3a5-d632836fb00c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.217279 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bca39c6-35b4-4c4b-b3a5-d632836fb00c-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.286362 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6f9d44df-8wrgp"]
Feb 19 09:03:45 crc kubenswrapper[4788]: W0219 09:03:45.298648 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02915100_34f3_4f6d_945c_1417a4bd06f7.slice/crio-518caa2ab4639f89f422b5ca1ea771f806580cb2675d129295c31f7e74a8cf1b WatchSource:0}: Error finding container 518caa2ab4639f89f422b5ca1ea771f806580cb2675d129295c31f7e74a8cf1b: Status 404 returned error can't find the container with id 518caa2ab4639f89f422b5ca1ea771f806580cb2675d129295c31f7e74a8cf1b
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.302292 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-69f6799bd7-ht4q2"]
Feb 19 09:03:45 crc kubenswrapper[4788]: W0219 09:03:45.515779 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7b1972f_d2de_4154_a5fd_1b0adb9952a8.slice/crio-5cba9fc72d995ae00cca10a2de529b698b5357253f968f587a46d988e39e52da WatchSource:0}: Error finding container 5cba9fc72d995ae00cca10a2de529b698b5357253f968f587a46d988e39e52da: Status 404 returned error can't find the container with id 5cba9fc72d995ae00cca10a2de529b698b5357253f968f587a46d988e39e52da
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.517831 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-746b766c9d-9j8dk"]
Feb 19 09:03:45 crc kubenswrapper[4788]: W0219 09:03:45.519290 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd32302e3_6d30_4f9e_b993_e5fbaae1b9eb.slice/crio-86363edeb7200a18e5aaa30b4db7eda3dd1914c726b6be107b32091884ae82fa WatchSource:0}: Error finding container 86363edeb7200a18e5aaa30b4db7eda3dd1914c726b6be107b32091884ae82fa: Status 404 returned error can't find the container with id 86363edeb7200a18e5aaa30b4db7eda3dd1914c726b6be107b32091884ae82fa
Feb 19 09:03:45 crc kubenswrapper[4788]: W0219 09:03:45.523749 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11aa21ba_9ef2_4aff_bfd1_4c4409ed5d4f.slice/crio-8834aeb25d3747fb90c15b992a182df85f721af3eb14d0cb1e8feef1a5150d5d WatchSource:0}: Error finding container 8834aeb25d3747fb90c15b992a182df85f721af3eb14d0cb1e8feef1a5150d5d: Status 404 returned error can't find the container with id 8834aeb25d3747fb90c15b992a182df85f721af3eb14d0cb1e8feef1a5150d5d
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.529911 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6f874b587-dc7jr"]
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.534703 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-85d59d9b8-kg6zk"]
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.554876 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t48sf" event={"ID":"dbd8890d-e06b-45e5-865b-838e036ac302","Type":"ContainerStarted","Data":"775500a2ee518a0719167f603a3bf25d996b96f34ab500016625d881ebfa0f19"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.563887 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" event={"ID":"3e03adfa-047d-4b46-937c-80d72bd604c5","Type":"ContainerStarted","Data":"527bb08b553b8bdb1a4c578c3239bb1a00a990eca2dd5a644aeab92611b0da3b"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.563926 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" event={"ID":"3e03adfa-047d-4b46-937c-80d72bd604c5","Type":"ContainerStarted","Data":"f5366bd7f67756adf54aed47dc242eb441c564f0754854ac4eaf4abf5221885a"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.564529 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.569237 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f874b587-dc7jr" event={"ID":"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb","Type":"ContainerStarted","Data":"86363edeb7200a18e5aaa30b4db7eda3dd1914c726b6be107b32091884ae82fa"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.578622 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-746b766c9d-9j8dk" event={"ID":"b7b1972f-d2de-4154-a5fd-1b0adb9952a8","Type":"ContainerStarted","Data":"5cba9fc72d995ae00cca10a2de529b698b5357253f968f587a46d988e39e52da"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.583275 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-t48sf" podStartSLOduration=2.278466566 podStartE2EDuration="17.583256429s" podCreationTimestamp="2026-02-19 09:03:28 +0000 UTC" firstStartedPulling="2026-02-19 09:03:29.506397534 +0000 UTC m=+1111.494409006" lastFinishedPulling="2026-02-19 09:03:44.811187397 +0000 UTC m=+1126.799198869" observedRunningTime="2026-02-19 09:03:45.572586074 +0000 UTC m=+1127.560597546" watchObservedRunningTime="2026-02-19 09:03:45.583256429 +0000 UTC m=+1127.571267901"
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.585325 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a","Type":"ContainerStarted","Data":"1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.587657 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-85d59d9b8-kg6zk" event={"ID":"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f","Type":"ContainerStarted","Data":"8834aeb25d3747fb90c15b992a182df85f721af3eb14d0cb1e8feef1a5150d5d"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.591818 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" podStartSLOduration=10.591801331 podStartE2EDuration="10.591801331s" podCreationTimestamp="2026-02-19 09:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:45.588399636 +0000 UTC m=+1127.576411108" watchObservedRunningTime="2026-02-19 09:03:45.591801331 +0000 UTC m=+1127.579812803"
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.611167 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-64cb7499df-fjp7p" event={"ID":"5bca39c6-35b4-4c4b-b3a5-d632836fb00c","Type":"ContainerDied","Data":"a36927855690e3e29440a41e29d34b432488be75e31cfda7da0cefd084b58de4"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.611226 4788 scope.go:117] "RemoveContainer" containerID="ab2acc6732938dcf307008d2da05b7fb7e4d8a648977949a2a559f7b791dc4f9"
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.611399 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-64cb7499df-fjp7p"
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.636525 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7" event={"ID":"379f970e-625d-401e-b625-a81a8e19ec02","Type":"ContainerDied","Data":"1bf15969d8c0f72ba676d94388367ad28636561e19c17ec01f18672174dbd299"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.636645 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gmkv7"
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.664286 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-64cb7499df-fjp7p"]
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.677898 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-64cb7499df-fjp7p"]
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.678281 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5599cffd79-mbvfb"
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.687727 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5599cffd79-mbvfb" event={"ID":"530985ec-183d-4519-a8d2-b0aff1bb87b3","Type":"ContainerDied","Data":"edeb37d3b62355133cad50a0c7e519d3445afec9081d62c27daed0eb974b08a1"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.691548 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gmkv7"]
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.695283 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-69f6799bd7-ht4q2" event={"ID":"02915100-34f3-4f6d-945c-1417a4bd06f7","Type":"ContainerStarted","Data":"94a26baeb4429408c004491259b6f4aa7fe83c8ac54da3f4b3a2c5dabcbb2a16"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.695338 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-69f6799bd7-ht4q2" event={"ID":"02915100-34f3-4f6d-945c-1417a4bd06f7","Type":"ContainerStarted","Data":"518caa2ab4639f89f422b5ca1ea771f806580cb2675d129295c31f7e74a8cf1b"}
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.696190 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-69f6799bd7-ht4q2"
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.703048 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gmkv7"]
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.710980 4788 scope.go:117] "RemoveContainer" containerID="502a0091c25fe8629946739276942cf44b25e6393de5a151cf8436937dd87611"
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.750760 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-69f6799bd7-ht4q2" podStartSLOduration=10.750737373 podStartE2EDuration="10.750737373s" podCreationTimestamp="2026-02-19 09:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:45.717595661 +0000 UTC m=+1127.705607133" watchObservedRunningTime="2026-02-19 09:03:45.750737373 +0000 UTC m=+1127.738748835"
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.769070 4788 scope.go:117] "RemoveContainer" containerID="0b4d41e04ca5eeaec965bde080d2231eca884954ed95a372a6cdb32b858d63bc"
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.799513 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5599cffd79-mbvfb"]
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.815165 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5599cffd79-mbvfb"]
Feb 19 09:03:45 crc kubenswrapper[4788]: I0219 09:03:45.982525 4788 scope.go:117] "RemoveContainer" containerID="2bbafe4feb95a5e642e8c9c24bb6387d0f5a1f87470854c667a3746347e71759"
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.708095 4788 generic.go:334] "Generic (PLEG): container finished" podID="3e03adfa-047d-4b46-937c-80d72bd604c5" containerID="527bb08b553b8bdb1a4c578c3239bb1a00a990eca2dd5a644aeab92611b0da3b" exitCode=1
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.708268 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" event={"ID":"3e03adfa-047d-4b46-937c-80d72bd604c5","Type":"ContainerDied","Data":"527bb08b553b8bdb1a4c578c3239bb1a00a990eca2dd5a644aeab92611b0da3b"}
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.709007 4788 scope.go:117] "RemoveContainer" containerID="527bb08b553b8bdb1a4c578c3239bb1a00a990eca2dd5a644aeab92611b0da3b"
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.728509 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379f970e-625d-401e-b625-a81a8e19ec02" path="/var/lib/kubelet/pods/379f970e-625d-401e-b625-a81a8e19ec02/volumes"
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.729004 4788 generic.go:334] "Generic (PLEG): container finished" podID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" containerID="7b36b10197a0d5b17cafdbcc665820b7f388dd297fdebe705567e1ab67d47788" exitCode=1
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.729272 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="530985ec-183d-4519-a8d2-b0aff1bb87b3" path="/var/lib/kubelet/pods/530985ec-183d-4519-a8d2-b0aff1bb87b3/volumes"
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.729723 4788 scope.go:117] "RemoveContainer" containerID="7b36b10197a0d5b17cafdbcc665820b7f388dd297fdebe705567e1ab67d47788"
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.731150 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bca39c6-35b4-4c4b-b3a5-d632836fb00c" path="/var/lib/kubelet/pods/5bca39c6-35b4-4c4b-b3a5-d632836fb00c/volumes"
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.735869 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-746b766c9d-9j8dk" event={"ID":"b7b1972f-d2de-4154-a5fd-1b0adb9952a8","Type":"ContainerStarted","Data":"47bb87b33183c4147743d44e2764118c06f3af98534da2aead646de9d484be87"}
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.735942 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-85d59d9b8-kg6zk" event={"ID":"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f","Type":"ContainerDied","Data":"7b36b10197a0d5b17cafdbcc665820b7f388dd297fdebe705567e1ab67d47788"}
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.735965 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f874b587-dc7jr" event={"ID":"d32302e3-6d30-4f9e-b993-e5fbaae1b9eb","Type":"ContainerStarted","Data":"1ca1cbb4117e1823721f24c8f36db83670a21f03d8b00c314aeb411bdc5f7f8a"}
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.735989 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-746b766c9d-9j8dk"
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.736007 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6f874b587-dc7jr"
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.790049 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-746b766c9d-9j8dk" podStartSLOduration=8.790030833 podStartE2EDuration="8.790030833s" podCreationTimestamp="2026-02-19 09:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:46.755919467 +0000 UTC m=+1128.743930939" watchObservedRunningTime="2026-02-19 09:03:46.790030833 +0000 UTC m=+1128.778042305"
Feb 19 09:03:46 crc kubenswrapper[4788]: I0219 09:03:46.828611 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6f874b587-dc7jr" podStartSLOduration=8.82859379 podStartE2EDuration="8.82859379s" podCreationTimestamp="2026-02-19 09:03:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:46.800895263 +0000 UTC m=+1128.788906735" watchObservedRunningTime="2026-02-19 09:03:46.82859379 +0000 UTC m=+1128.816605262"
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.382078 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5b7d4466df-8w62q"
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.744591 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" event={"ID":"3e03adfa-047d-4b46-937c-80d72bd604c5","Type":"ContainerStarted","Data":"994807fc5b7795d2898b205329a36bd01ce0ad4eee6fa9cd235209b3d997f0a0"}
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.745188 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6f9d44df-8wrgp"
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.748448 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a","Type":"ContainerStarted","Data":"06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136"}
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.748540 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="ceilometer-central-agent" containerID="cri-o://fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733" gracePeriod=30
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.748618 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="proxy-httpd" containerID="cri-o://06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136" gracePeriod=30
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.748658 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="ceilometer-notification-agent" containerID="cri-o://eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150" gracePeriod=30
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.748639 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="sg-core" containerID="cri-o://1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d" gracePeriod=30
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.748626 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.754565 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-85d59d9b8-kg6zk" event={"ID":"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f","Type":"ContainerStarted","Data":"56589fde69177f5c1fe00a1abde75b592721538fabd9d2c28718b2dea6431585"}
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.754931 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-85d59d9b8-kg6zk"
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.779885 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-85d59d9b8-kg6zk" podStartSLOduration=12.779861086 podStartE2EDuration="12.779861086s" podCreationTimestamp="2026-02-19 09:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:03:47.778378629 +0000 UTC m=+1129.766390101" watchObservedRunningTime="2026-02-19 09:03:47.779861086 +0000 UTC m=+1129.767872558"
Feb 19 09:03:47 crc kubenswrapper[4788]: I0219 09:03:47.802607 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.375656678 podStartE2EDuration="17.80258954s" podCreationTimestamp="2026-02-19 09:03:30 +0000 UTC" firstStartedPulling="2026-02-19 09:03:31.434933933 +0000 UTC m=+1113.422945415" lastFinishedPulling="2026-02-19 09:03:46.861866805 +0000 UTC m=+1128.849878277" observedRunningTime="2026-02-19 09:03:47.799745559 +0000 UTC m=+1129.787757031" watchObservedRunningTime="2026-02-19 09:03:47.80258954 +0000 UTC m=+1129.790601002"
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.764541 4788 generic.go:334] "Generic (PLEG): container finished" podID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" containerID="56589fde69177f5c1fe00a1abde75b592721538fabd9d2c28718b2dea6431585" exitCode=1
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.765157 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-85d59d9b8-kg6zk" event={"ID":"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f","Type":"ContainerDied","Data":"56589fde69177f5c1fe00a1abde75b592721538fabd9d2c28718b2dea6431585"}
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.765199 4788 scope.go:117] "RemoveContainer" containerID="7b36b10197a0d5b17cafdbcc665820b7f388dd297fdebe705567e1ab67d47788"
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.767187 4788 scope.go:117] "RemoveContainer" containerID="56589fde69177f5c1fe00a1abde75b592721538fabd9d2c28718b2dea6431585"
Feb 19 09:03:48 crc kubenswrapper[4788]: E0219 09:03:48.767653 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-85d59d9b8-kg6zk_openstack(11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f)\"" pod="openstack/heat-api-85d59d9b8-kg6zk" podUID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f"
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.770329 4788 generic.go:334] "Generic (PLEG): container finished" podID="3e03adfa-047d-4b46-937c-80d72bd604c5" containerID="994807fc5b7795d2898b205329a36bd01ce0ad4eee6fa9cd235209b3d997f0a0" exitCode=1
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.770505 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" event={"ID":"3e03adfa-047d-4b46-937c-80d72bd604c5","Type":"ContainerDied","Data":"994807fc5b7795d2898b205329a36bd01ce0ad4eee6fa9cd235209b3d997f0a0"}
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.771628 4788 scope.go:117] "RemoveContainer" containerID="994807fc5b7795d2898b205329a36bd01ce0ad4eee6fa9cd235209b3d997f0a0"
Feb 19 09:03:48 crc kubenswrapper[4788]: E0219 09:03:48.772051 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6f9d44df-8wrgp_openstack(3e03adfa-047d-4b46-937c-80d72bd604c5)\"" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" podUID="3e03adfa-047d-4b46-937c-80d72bd604c5"
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.773400 4788 generic.go:334] "Generic (PLEG): container finished" podID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerID="06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136" exitCode=0
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.773498 4788 generic.go:334] "Generic (PLEG): container finished" podID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerID="1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d" exitCode=2
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.773572 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a","Type":"ContainerDied","Data":"06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136"}
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.773663 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a","Type":"ContainerDied","Data":"1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d"}
Feb 19 09:03:48 crc kubenswrapper[4788]: I0219 09:03:48.834754 4788 scope.go:117] "RemoveContainer" containerID="527bb08b553b8bdb1a4c578c3239bb1a00a990eca2dd5a644aeab92611b0da3b"
Feb 19 09:03:49 crc kubenswrapper[4788]: I0219 09:03:49.801893 4788 generic.go:334] "Generic (PLEG): container finished" podID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerID="eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150" exitCode=0
Feb 19 09:03:49 crc kubenswrapper[4788]: I0219 09:03:49.801998 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a","Type":"ContainerDied","Data":"eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150"}
Feb 19 09:03:49 crc kubenswrapper[4788]: I0219 09:03:49.806703 4788 scope.go:117] "RemoveContainer" containerID="56589fde69177f5c1fe00a1abde75b592721538fabd9d2c28718b2dea6431585"
Feb 19 09:03:49 crc kubenswrapper[4788]: E0219 09:03:49.806940 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-85d59d9b8-kg6zk_openstack(11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f)\"" pod="openstack/heat-api-85d59d9b8-kg6zk" podUID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f"
Feb 19 09:03:49 crc kubenswrapper[4788]: I0219 09:03:49.808519 4788 scope.go:117] "RemoveContainer" containerID="994807fc5b7795d2898b205329a36bd01ce0ad4eee6fa9cd235209b3d997f0a0"
Feb 19 09:03:49 crc kubenswrapper[4788]: E0219 09:03:49.808755 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6f9d44df-8wrgp_openstack(3e03adfa-047d-4b46-937c-80d72bd604c5)\"" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" podUID="3e03adfa-047d-4b46-937c-80d72bd604c5"
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.766425 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.819285 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.819238 4788 generic.go:334] "Generic (PLEG): container finished" podID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerID="fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733" exitCode=0
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.819281 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a","Type":"ContainerDied","Data":"fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733"}
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.819443 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a","Type":"ContainerDied","Data":"fc43e4adb5dcd226eb71dc18698879a4c1dbe9d28b0f2099116e46bbe9db16a8"}
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.819464 4788 scope.go:117] "RemoveContainer" containerID="06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136"
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.840780 4788 scope.go:117] "RemoveContainer" containerID="1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d"
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.852980 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-run-httpd\") pod \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") "
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.853104 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lfbd\" (UniqueName: \"kubernetes.io/projected/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-kube-api-access-5lfbd\") pod \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") "
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.853134 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-sg-core-conf-yaml\") pod \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") "
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.853181 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-scripts\") pod \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") "
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.853304 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-combined-ca-bundle\") pod \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") "
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.853335 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-config-data\") pod \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") "
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.853407 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-log-httpd\") pod \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\" (UID: \"bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a\") "
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.854694 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" (UID: "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.858572 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" (UID: "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.861569 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-scripts" (OuterVolumeSpecName: "scripts") pod "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" (UID: "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.863138 4788 scope.go:117] "RemoveContainer" containerID="eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150"
Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.872652 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-kube-api-access-5lfbd" (OuterVolumeSpecName: "kube-api-access-5lfbd") pod "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" (UID: "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a"). InnerVolumeSpecName "kube-api-access-5lfbd".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.899126 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" (UID: "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.937445 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" (UID: "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.955858 4788 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.955910 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lfbd\" (UniqueName: \"kubernetes.io/projected/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-kube-api-access-5lfbd\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.955928 4788 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.955940 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-scripts\") on node \"crc\" 
DevicePath \"\"" Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.955952 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.955966 4788 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:50 crc kubenswrapper[4788]: I0219 09:03:50.967664 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-config-data" (OuterVolumeSpecName: "config-data") pod "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" (UID: "bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.006760 4788 scope.go:117] "RemoveContainer" containerID="fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.045116 4788 scope.go:117] "RemoveContainer" containerID="06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.045526 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136\": container with ID starting with 06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136 not found: ID does not exist" containerID="06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.045559 4788 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136"} err="failed to get container status \"06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136\": rpc error: code = NotFound desc = could not find container \"06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136\": container with ID starting with 06a96f4934316428b9fa4bad74905b94742ffa274700ab07f093e0154119b136 not found: ID does not exist" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.045579 4788 scope.go:117] "RemoveContainer" containerID="1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.045991 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d\": container with ID starting with 1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d not found: ID does not exist" containerID="1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.046014 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d"} err="failed to get container status \"1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d\": rpc error: code = NotFound desc = could not find container \"1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d\": container with ID starting with 1ac80c823da83c04c63af6d8c1805e4f4ad93dcf00c054b739047cda0d19688d not found: ID does not exist" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.046027 4788 scope.go:117] "RemoveContainer" containerID="eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.046485 4788 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150\": container with ID starting with eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150 not found: ID does not exist" containerID="eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.046509 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150"} err="failed to get container status \"eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150\": rpc error: code = NotFound desc = could not find container \"eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150\": container with ID starting with eb365de921b4af70d4c507bbd59305ea183980b8253dacd72eacd79d03e80150 not found: ID does not exist" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.046523 4788 scope.go:117] "RemoveContainer" containerID="fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.046876 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733\": container with ID starting with fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733 not found: ID does not exist" containerID="fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.046931 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733"} err="failed to get container status \"fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733\": rpc error: code = NotFound desc = could not find container 
\"fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733\": container with ID starting with fe8303d6c046d644da108bc2b558a605e1c45954deca4da53636b1a5e3b8c733 not found: ID does not exist" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.058568 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.153149 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.169981 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.198814 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.200472 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="ceilometer-notification-agent" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.200504 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="ceilometer-notification-agent" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.200538 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379f970e-625d-401e-b625-a81a8e19ec02" containerName="dnsmasq-dns" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.200547 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="379f970e-625d-401e-b625-a81a8e19ec02" containerName="dnsmasq-dns" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.200568 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379f970e-625d-401e-b625-a81a8e19ec02" containerName="init" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.200576 4788 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="379f970e-625d-401e-b625-a81a8e19ec02" containerName="init" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.200610 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="530985ec-183d-4519-a8d2-b0aff1bb87b3" containerName="heat-api" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.200618 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="530985ec-183d-4519-a8d2-b0aff1bb87b3" containerName="heat-api" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.200652 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="proxy-httpd" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.200662 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="proxy-httpd" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.200676 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bca39c6-35b4-4c4b-b3a5-d632836fb00c" containerName="heat-cfnapi" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.200685 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bca39c6-35b4-4c4b-b3a5-d632836fb00c" containerName="heat-cfnapi" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.200693 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="ceilometer-central-agent" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.200719 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="ceilometer-central-agent" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.200743 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="sg-core" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.200750 4788 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="sg-core" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.201231 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="ceilometer-central-agent" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.201267 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="530985ec-183d-4519-a8d2-b0aff1bb87b3" containerName="heat-api" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.201306 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="379f970e-625d-401e-b625-a81a8e19ec02" containerName="dnsmasq-dns" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.201329 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="sg-core" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.201354 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bca39c6-35b4-4c4b-b3a5-d632836fb00c" containerName="heat-cfnapi" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.201369 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="ceilometer-notification-agent" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.201379 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" containerName="proxy-httpd" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.208619 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.212073 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.212471 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.242431 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.304101 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.304834 4788 scope.go:117] "RemoveContainer" containerID="994807fc5b7795d2898b205329a36bd01ce0ad4eee6fa9cd235209b3d997f0a0" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.305119 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6f9d44df-8wrgp_openstack(3e03adfa-047d-4b46-937c-80d72bd604c5)\"" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" podUID="3e03adfa-047d-4b46-937c-80d72bd604c5" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.319570 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-85d59d9b8-kg6zk" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.320294 4788 scope.go:117] "RemoveContainer" containerID="56589fde69177f5c1fe00a1abde75b592721538fabd9d2c28718b2dea6431585" Feb 19 09:03:51 crc kubenswrapper[4788]: E0219 09:03:51.320511 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api 
pod=heat-api-85d59d9b8-kg6zk_openstack(11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f)\"" pod="openstack/heat-api-85d59d9b8-kg6zk" podUID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.365791 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.365921 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-config-data\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.366056 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb8dg\" (UniqueName: \"kubernetes.io/projected/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-kube-api-access-tb8dg\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.366120 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-run-httpd\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.366141 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-log-httpd\") pod \"ceilometer-0\" (UID: 
\"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.366446 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-scripts\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.366516 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.468610 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb8dg\" (UniqueName: \"kubernetes.io/projected/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-kube-api-access-tb8dg\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.468725 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-run-httpd\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.468748 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-log-httpd\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.468860 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-scripts\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.468887 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.468982 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.469037 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-config-data\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.470879 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-run-httpd\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.471110 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-log-httpd\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 
19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.473410 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-config-data\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.476904 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-scripts\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.477003 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.477057 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.496161 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb8dg\" (UniqueName: \"kubernetes.io/projected/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-kube-api-access-tb8dg\") pod \"ceilometer-0\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " pod="openstack/ceilometer-0" Feb 19 09:03:51 crc kubenswrapper[4788]: I0219 09:03:51.546807 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:03:52 crc kubenswrapper[4788]: I0219 09:03:52.014540 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:03:52 crc kubenswrapper[4788]: I0219 09:03:52.139339 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:03:52 crc kubenswrapper[4788]: I0219 09:03:52.139416 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:03:52 crc kubenswrapper[4788]: I0219 09:03:52.525856 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:03:52 crc kubenswrapper[4788]: I0219 09:03:52.725636 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a" path="/var/lib/kubelet/pods/bb8c4a75-9d4d-49a0-b1db-26f5bfd6270a/volumes" Feb 19 09:03:52 crc kubenswrapper[4788]: I0219 09:03:52.840864 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bd3c7f8-3b78-41c9-8265-bda65d7540b8","Type":"ContainerStarted","Data":"d1d5a1e1769baefff8ea6c4a6ca66358441f831b351968db5f3b401a1fa66fcf"} Feb 19 09:03:52 crc kubenswrapper[4788]: I0219 09:03:52.840917 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bd3c7f8-3b78-41c9-8265-bda65d7540b8","Type":"ContainerStarted","Data":"8043cfa04b95e8e9eaefef688d9822959c42d9b0db400e878ba8283d053f6a92"} Feb 19 09:03:53 crc kubenswrapper[4788]: I0219 
09:03:53.852275 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bd3c7f8-3b78-41c9-8265-bda65d7540b8","Type":"ContainerStarted","Data":"c2acdb38a22b4421ec517a73398f462a5df06236b31665a50503e7c73569fbe9"} Feb 19 09:03:54 crc kubenswrapper[4788]: I0219 09:03:54.862495 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bd3c7f8-3b78-41c9-8265-bda65d7540b8","Type":"ContainerStarted","Data":"a1fea8c70ea2af942b20f5a1943ce8cd02199112a9634e2db42227f4cfb5f5bc"} Feb 19 09:03:55 crc kubenswrapper[4788]: I0219 09:03:55.631825 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-746b766c9d-9j8dk" Feb 19 09:03:55 crc kubenswrapper[4788]: I0219 09:03:55.740770 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-85d59d9b8-kg6zk"] Feb 19 09:03:55 crc kubenswrapper[4788]: I0219 09:03:55.818750 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6f874b587-dc7jr" Feb 19 09:03:55 crc kubenswrapper[4788]: I0219 09:03:55.886433 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6f9d44df-8wrgp"] Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.164693 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-85d59d9b8-kg6zk" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.277498 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-69f6799bd7-ht4q2" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.294582 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-combined-ca-bundle\") pod \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.294922 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data-custom\") pod \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.295287 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data\") pod \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.295420 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gts8b\" (UniqueName: \"kubernetes.io/projected/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-kube-api-access-gts8b\") pod \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\" (UID: \"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f\") " Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.303797 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-kube-api-access-gts8b" (OuterVolumeSpecName: "kube-api-access-gts8b") pod "11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" (UID: 
"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f"). InnerVolumeSpecName "kube-api-access-gts8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.304612 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" (UID: "11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.343168 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" (UID: "11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.389593 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5b7d4466df-8w62q"] Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.389805 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-5b7d4466df-8w62q" podUID="c7b3ea5f-28c5-4aa3-ae05-94bce578741d" containerName="heat-engine" containerID="cri-o://db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c" gracePeriod=60 Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.398774 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gts8b\" (UniqueName: \"kubernetes.io/projected/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-kube-api-access-gts8b\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.399271 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.399283 4788 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.402460 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data" (OuterVolumeSpecName: "config-data") pod "11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" (UID: "11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.412756 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.501171 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.602421 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data-custom\") pod \"3e03adfa-047d-4b46-937c-80d72bd604c5\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.602513 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkj5q\" (UniqueName: \"kubernetes.io/projected/3e03adfa-047d-4b46-937c-80d72bd604c5-kube-api-access-nkj5q\") pod \"3e03adfa-047d-4b46-937c-80d72bd604c5\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") 
" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.602722 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-combined-ca-bundle\") pod \"3e03adfa-047d-4b46-937c-80d72bd604c5\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.602869 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data\") pod \"3e03adfa-047d-4b46-937c-80d72bd604c5\" (UID: \"3e03adfa-047d-4b46-937c-80d72bd604c5\") " Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.607198 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3e03adfa-047d-4b46-937c-80d72bd604c5" (UID: "3e03adfa-047d-4b46-937c-80d72bd604c5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.607379 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e03adfa-047d-4b46-937c-80d72bd604c5-kube-api-access-nkj5q" (OuterVolumeSpecName: "kube-api-access-nkj5q") pod "3e03adfa-047d-4b46-937c-80d72bd604c5" (UID: "3e03adfa-047d-4b46-937c-80d72bd604c5"). InnerVolumeSpecName "kube-api-access-nkj5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.628351 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e03adfa-047d-4b46-937c-80d72bd604c5" (UID: "3e03adfa-047d-4b46-937c-80d72bd604c5"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.647220 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data" (OuterVolumeSpecName: "config-data") pod "3e03adfa-047d-4b46-937c-80d72bd604c5" (UID: "3e03adfa-047d-4b46-937c-80d72bd604c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.704822 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.704859 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.704868 4788 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e03adfa-047d-4b46-937c-80d72bd604c5-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.704880 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkj5q\" (UniqueName: \"kubernetes.io/projected/3e03adfa-047d-4b46-937c-80d72bd604c5-kube-api-access-nkj5q\") on node \"crc\" DevicePath \"\"" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.886975 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="ceilometer-central-agent" containerID="cri-o://d1d5a1e1769baefff8ea6c4a6ca66358441f831b351968db5f3b401a1fa66fcf" gracePeriod=30 Feb 19 09:03:56 crc 
kubenswrapper[4788]: I0219 09:03:56.887113 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="proxy-httpd" containerID="cri-o://986679b645b6aa54eddc48eb5a4484dfe0abf5f9a07c7070b17c0362e9399ae3" gracePeriod=30 Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.887164 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="sg-core" containerID="cri-o://a1fea8c70ea2af942b20f5a1943ce8cd02199112a9634e2db42227f4cfb5f5bc" gracePeriod=30 Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.887204 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="ceilometer-notification-agent" containerID="cri-o://c2acdb38a22b4421ec517a73398f462a5df06236b31665a50503e7c73569fbe9" gracePeriod=30 Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.887408 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bd3c7f8-3b78-41c9-8265-bda65d7540b8","Type":"ContainerStarted","Data":"986679b645b6aa54eddc48eb5a4484dfe0abf5f9a07c7070b17c0362e9399ae3"} Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.887473 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.895484 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" event={"ID":"3e03adfa-047d-4b46-937c-80d72bd604c5","Type":"ContainerDied","Data":"f5366bd7f67756adf54aed47dc242eb441c564f0754854ac4eaf4abf5221885a"} Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.895542 4788 scope.go:117] "RemoveContainer" containerID="994807fc5b7795d2898b205329a36bd01ce0ad4eee6fa9cd235209b3d997f0a0" Feb 19 09:03:56 crc 
kubenswrapper[4788]: I0219 09:03:56.895543 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6f9d44df-8wrgp" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.900029 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-85d59d9b8-kg6zk" event={"ID":"11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f","Type":"ContainerDied","Data":"8834aeb25d3747fb90c15b992a182df85f721af3eb14d0cb1e8feef1a5150d5d"} Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.900111 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-85d59d9b8-kg6zk" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.926692 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.528829767 podStartE2EDuration="5.926667768s" podCreationTimestamp="2026-02-19 09:03:51 +0000 UTC" firstStartedPulling="2026-02-19 09:03:52.013987025 +0000 UTC m=+1134.001998497" lastFinishedPulling="2026-02-19 09:03:56.411825026 +0000 UTC m=+1138.399836498" observedRunningTime="2026-02-19 09:03:56.909508422 +0000 UTC m=+1138.897519894" watchObservedRunningTime="2026-02-19 09:03:56.926667768 +0000 UTC m=+1138.914679240" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.938370 4788 scope.go:117] "RemoveContainer" containerID="56589fde69177f5c1fe00a1abde75b592721538fabd9d2c28718b2dea6431585" Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.960387 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-85d59d9b8-kg6zk"] Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.973034 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-85d59d9b8-kg6zk"] Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.983915 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6f9d44df-8wrgp"] Feb 19 09:03:56 crc kubenswrapper[4788]: I0219 09:03:56.991046 
4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6f9d44df-8wrgp"] Feb 19 09:03:57 crc kubenswrapper[4788]: E0219 09:03:57.238465 4788 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 19 09:03:57 crc kubenswrapper[4788]: E0219 09:03:57.240107 4788 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 19 09:03:57 crc kubenswrapper[4788]: E0219 09:03:57.242394 4788 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 19 09:03:57 crc kubenswrapper[4788]: E0219 09:03:57.242443 4788 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5b7d4466df-8w62q" podUID="c7b3ea5f-28c5-4aa3-ae05-94bce578741d" containerName="heat-engine" Feb 19 09:03:57 crc kubenswrapper[4788]: I0219 09:03:57.911337 4788 generic.go:334] "Generic (PLEG): container finished" podID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerID="a1fea8c70ea2af942b20f5a1943ce8cd02199112a9634e2db42227f4cfb5f5bc" exitCode=2 Feb 19 09:03:57 crc kubenswrapper[4788]: I0219 09:03:57.911366 4788 generic.go:334] "Generic 
(PLEG): container finished" podID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerID="c2acdb38a22b4421ec517a73398f462a5df06236b31665a50503e7c73569fbe9" exitCode=0 Feb 19 09:03:57 crc kubenswrapper[4788]: I0219 09:03:57.911401 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bd3c7f8-3b78-41c9-8265-bda65d7540b8","Type":"ContainerDied","Data":"a1fea8c70ea2af942b20f5a1943ce8cd02199112a9634e2db42227f4cfb5f5bc"} Feb 19 09:03:57 crc kubenswrapper[4788]: I0219 09:03:57.911423 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bd3c7f8-3b78-41c9-8265-bda65d7540b8","Type":"ContainerDied","Data":"c2acdb38a22b4421ec517a73398f462a5df06236b31665a50503e7c73569fbe9"} Feb 19 09:03:58 crc kubenswrapper[4788]: I0219 09:03:58.731940 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" path="/var/lib/kubelet/pods/11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f/volumes" Feb 19 09:03:58 crc kubenswrapper[4788]: I0219 09:03:58.733044 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e03adfa-047d-4b46-937c-80d72bd604c5" path="/var/lib/kubelet/pods/3e03adfa-047d-4b46-937c-80d72bd604c5/volumes" Feb 19 09:04:07 crc kubenswrapper[4788]: E0219 09:04:07.238264 4788 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 19 09:04:07 crc kubenswrapper[4788]: E0219 09:04:07.241152 4788 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c" 
cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 19 09:04:07 crc kubenswrapper[4788]: E0219 09:04:07.242566 4788 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 19 09:04:07 crc kubenswrapper[4788]: E0219 09:04:07.242644 4788 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5b7d4466df-8w62q" podUID="c7b3ea5f-28c5-4aa3-ae05-94bce578741d" containerName="heat-engine" Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.594467 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5b7d4466df-8w62q" Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.757345 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data\") pod \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.758063 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data-custom\") pod \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.758107 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksq6m\" (UniqueName: \"kubernetes.io/projected/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-kube-api-access-ksq6m\") pod 
\"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.758138 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-combined-ca-bundle\") pod \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\" (UID: \"c7b3ea5f-28c5-4aa3-ae05-94bce578741d\") " Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.763431 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c7b3ea5f-28c5-4aa3-ae05-94bce578741d" (UID: "c7b3ea5f-28c5-4aa3-ae05-94bce578741d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.763481 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-kube-api-access-ksq6m" (OuterVolumeSpecName: "kube-api-access-ksq6m") pod "c7b3ea5f-28c5-4aa3-ae05-94bce578741d" (UID: "c7b3ea5f-28c5-4aa3-ae05-94bce578741d"). InnerVolumeSpecName "kube-api-access-ksq6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.789324 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7b3ea5f-28c5-4aa3-ae05-94bce578741d" (UID: "c7b3ea5f-28c5-4aa3-ae05-94bce578741d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.812576 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data" (OuterVolumeSpecName: "config-data") pod "c7b3ea5f-28c5-4aa3-ae05-94bce578741d" (UID: "c7b3ea5f-28c5-4aa3-ae05-94bce578741d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.860392 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.860427 4788 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.860442 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksq6m\" (UniqueName: \"kubernetes.io/projected/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-kube-api-access-ksq6m\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:08 crc kubenswrapper[4788]: I0219 09:04:08.860453 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b3ea5f-28c5-4aa3-ae05-94bce578741d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:09 crc kubenswrapper[4788]: I0219 09:04:09.018853 4788 generic.go:334] "Generic (PLEG): container finished" podID="c7b3ea5f-28c5-4aa3-ae05-94bce578741d" containerID="db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c" exitCode=0 Feb 19 09:04:09 crc kubenswrapper[4788]: I0219 09:04:09.018921 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5b7d4466df-8w62q" Feb 19 09:04:09 crc kubenswrapper[4788]: I0219 09:04:09.018936 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b7d4466df-8w62q" event={"ID":"c7b3ea5f-28c5-4aa3-ae05-94bce578741d","Type":"ContainerDied","Data":"db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c"} Feb 19 09:04:09 crc kubenswrapper[4788]: I0219 09:04:09.019729 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b7d4466df-8w62q" event={"ID":"c7b3ea5f-28c5-4aa3-ae05-94bce578741d","Type":"ContainerDied","Data":"1570c84454328d5ef4c6b52e4987a7eebbf6a28df0a305905635215846480df3"} Feb 19 09:04:09 crc kubenswrapper[4788]: I0219 09:04:09.019758 4788 scope.go:117] "RemoveContainer" containerID="db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c" Feb 19 09:04:09 crc kubenswrapper[4788]: I0219 09:04:09.045571 4788 scope.go:117] "RemoveContainer" containerID="db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c" Feb 19 09:04:09 crc kubenswrapper[4788]: E0219 09:04:09.046352 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c\": container with ID starting with db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c not found: ID does not exist" containerID="db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c" Feb 19 09:04:09 crc kubenswrapper[4788]: I0219 09:04:09.046384 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c"} err="failed to get container status \"db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c\": rpc error: code = NotFound desc = could not find container \"db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c\": container with 
ID starting with db38269184292f014222bd2778f5135bd596e9d6f5cf0c679cf0c74525695e4c not found: ID does not exist" Feb 19 09:04:09 crc kubenswrapper[4788]: I0219 09:04:09.080084 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5b7d4466df-8w62q"] Feb 19 09:04:09 crc kubenswrapper[4788]: I0219 09:04:09.091614 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-5b7d4466df-8w62q"] Feb 19 09:04:10 crc kubenswrapper[4788]: I0219 09:04:10.724810 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b3ea5f-28c5-4aa3-ae05-94bce578741d" path="/var/lib/kubelet/pods/c7b3ea5f-28c5-4aa3-ae05-94bce578741d/volumes" Feb 19 09:04:11 crc kubenswrapper[4788]: I0219 09:04:11.040565 4788 generic.go:334] "Generic (PLEG): container finished" podID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerID="d1d5a1e1769baefff8ea6c4a6ca66358441f831b351968db5f3b401a1fa66fcf" exitCode=0 Feb 19 09:04:11 crc kubenswrapper[4788]: I0219 09:04:11.040607 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bd3c7f8-3b78-41c9-8265-bda65d7540b8","Type":"ContainerDied","Data":"d1d5a1e1769baefff8ea6c4a6ca66358441f831b351968db5f3b401a1fa66fcf"} Feb 19 09:04:13 crc kubenswrapper[4788]: I0219 09:04:13.060630 4788 generic.go:334] "Generic (PLEG): container finished" podID="dbd8890d-e06b-45e5-865b-838e036ac302" containerID="775500a2ee518a0719167f603a3bf25d996b96f34ab500016625d881ebfa0f19" exitCode=0 Feb 19 09:04:13 crc kubenswrapper[4788]: I0219 09:04:13.060791 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t48sf" event={"ID":"dbd8890d-e06b-45e5-865b-838e036ac302","Type":"ContainerDied","Data":"775500a2ee518a0719167f603a3bf25d996b96f34ab500016625d881ebfa0f19"} Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.537112 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t48sf" Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.701977 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-combined-ca-bundle\") pod \"dbd8890d-e06b-45e5-865b-838e036ac302\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.702089 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-config-data\") pod \"dbd8890d-e06b-45e5-865b-838e036ac302\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.702208 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-scripts\") pod \"dbd8890d-e06b-45e5-865b-838e036ac302\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.702329 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjt2t\" (UniqueName: \"kubernetes.io/projected/dbd8890d-e06b-45e5-865b-838e036ac302-kube-api-access-wjt2t\") pod \"dbd8890d-e06b-45e5-865b-838e036ac302\" (UID: \"dbd8890d-e06b-45e5-865b-838e036ac302\") " Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.707952 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd8890d-e06b-45e5-865b-838e036ac302-kube-api-access-wjt2t" (OuterVolumeSpecName: "kube-api-access-wjt2t") pod "dbd8890d-e06b-45e5-865b-838e036ac302" (UID: "dbd8890d-e06b-45e5-865b-838e036ac302"). InnerVolumeSpecName "kube-api-access-wjt2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.709044 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-scripts" (OuterVolumeSpecName: "scripts") pod "dbd8890d-e06b-45e5-865b-838e036ac302" (UID: "dbd8890d-e06b-45e5-865b-838e036ac302"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.728669 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbd8890d-e06b-45e5-865b-838e036ac302" (UID: "dbd8890d-e06b-45e5-865b-838e036ac302"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.730779 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-config-data" (OuterVolumeSpecName: "config-data") pod "dbd8890d-e06b-45e5-865b-838e036ac302" (UID: "dbd8890d-e06b-45e5-865b-838e036ac302"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.805597 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.806122 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjt2t\" (UniqueName: \"kubernetes.io/projected/dbd8890d-e06b-45e5-865b-838e036ac302-kube-api-access-wjt2t\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.806144 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:14 crc kubenswrapper[4788]: I0219 09:04:14.806155 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd8890d-e06b-45e5-865b-838e036ac302-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.080621 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t48sf" event={"ID":"dbd8890d-e06b-45e5-865b-838e036ac302","Type":"ContainerDied","Data":"740aa7921bd9b4201c1a2df85f945ec2e5f55cba2a0f526691e4bdda3d0651f9"} Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.080663 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="740aa7921bd9b4201c1a2df85f945ec2e5f55cba2a0f526691e4bdda3d0651f9" Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.080725 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t48sf"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.227327 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 19 09:04:15 crc kubenswrapper[4788]: E0219 09:04:15.228020 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b3ea5f-28c5-4aa3-ae05-94bce578741d" containerName="heat-engine"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.228126 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b3ea5f-28c5-4aa3-ae05-94bce578741d" containerName="heat-engine"
Feb 19 09:04:15 crc kubenswrapper[4788]: E0219 09:04:15.228259 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" containerName="heat-api"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.228350 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" containerName="heat-api"
Feb 19 09:04:15 crc kubenswrapper[4788]: E0219 09:04:15.228448 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e03adfa-047d-4b46-937c-80d72bd604c5" containerName="heat-cfnapi"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.228514 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e03adfa-047d-4b46-937c-80d72bd604c5" containerName="heat-cfnapi"
Feb 19 09:04:15 crc kubenswrapper[4788]: E0219 09:04:15.228587 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" containerName="heat-api"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.228659 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" containerName="heat-api"
Feb 19 09:04:15 crc kubenswrapper[4788]: E0219 09:04:15.228759 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd8890d-e06b-45e5-865b-838e036ac302" containerName="nova-cell0-conductor-db-sync"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.228813 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd8890d-e06b-45e5-865b-838e036ac302" containerName="nova-cell0-conductor-db-sync"
Feb 19 09:04:15 crc kubenswrapper[4788]: E0219 09:04:15.228865 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e03adfa-047d-4b46-937c-80d72bd604c5" containerName="heat-cfnapi"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.229019 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e03adfa-047d-4b46-937c-80d72bd604c5" containerName="heat-cfnapi"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.229286 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e03adfa-047d-4b46-937c-80d72bd604c5" containerName="heat-cfnapi"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.229356 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" containerName="heat-api"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.229420 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="11aa21ba-9ef2-4aff-bfd1-4c4409ed5d4f" containerName="heat-api"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.229490 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd8890d-e06b-45e5-865b-838e036ac302" containerName="nova-cell0-conductor-db-sync"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.229545 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b3ea5f-28c5-4aa3-ae05-94bce578741d" containerName="heat-engine"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.230305 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.232722 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-zvvr6"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.233180 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.238623 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.314382 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8sj\" (UniqueName: \"kubernetes.io/projected/4c696528-c586-4ecc-8788-df43d6d03193-kube-api-access-4k8sj\") pod \"nova-cell0-conductor-0\" (UID: \"4c696528-c586-4ecc-8788-df43d6d03193\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.314495 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c696528-c586-4ecc-8788-df43d6d03193-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4c696528-c586-4ecc-8788-df43d6d03193\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.314524 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c696528-c586-4ecc-8788-df43d6d03193-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4c696528-c586-4ecc-8788-df43d6d03193\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.415371 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k8sj\" (UniqueName: \"kubernetes.io/projected/4c696528-c586-4ecc-8788-df43d6d03193-kube-api-access-4k8sj\") pod \"nova-cell0-conductor-0\" (UID: \"4c696528-c586-4ecc-8788-df43d6d03193\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.415462 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c696528-c586-4ecc-8788-df43d6d03193-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4c696528-c586-4ecc-8788-df43d6d03193\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.415484 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c696528-c586-4ecc-8788-df43d6d03193-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4c696528-c586-4ecc-8788-df43d6d03193\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.420811 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c696528-c586-4ecc-8788-df43d6d03193-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4c696528-c586-4ecc-8788-df43d6d03193\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.430750 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c696528-c586-4ecc-8788-df43d6d03193-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4c696528-c586-4ecc-8788-df43d6d03193\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.435597 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k8sj\" (UniqueName: \"kubernetes.io/projected/4c696528-c586-4ecc-8788-df43d6d03193-kube-api-access-4k8sj\") pod \"nova-cell0-conductor-0\" (UID: \"4c696528-c586-4ecc-8788-df43d6d03193\") " pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:15 crc kubenswrapper[4788]: I0219 09:04:15.548514 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:16 crc kubenswrapper[4788]: I0219 09:04:16.036589 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 19 09:04:16 crc kubenswrapper[4788]: W0219 09:04:16.037184 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c696528_c586_4ecc_8788_df43d6d03193.slice/crio-98eea6eb99ca5caadabe2981c72693f73888c628a88210826c5e9b9ad9c46099 WatchSource:0}: Error finding container 98eea6eb99ca5caadabe2981c72693f73888c628a88210826c5e9b9ad9c46099: Status 404 returned error can't find the container with id 98eea6eb99ca5caadabe2981c72693f73888c628a88210826c5e9b9ad9c46099
Feb 19 09:04:16 crc kubenswrapper[4788]: I0219 09:04:16.100670 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4c696528-c586-4ecc-8788-df43d6d03193","Type":"ContainerStarted","Data":"98eea6eb99ca5caadabe2981c72693f73888c628a88210826c5e9b9ad9c46099"}
Feb 19 09:04:17 crc kubenswrapper[4788]: I0219 09:04:17.112117 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4c696528-c586-4ecc-8788-df43d6d03193","Type":"ContainerStarted","Data":"c494e512f3f50906ea56044816b456d4730feb9f9e58ca5146ce3c48f7f735d7"}
Feb 19 09:04:17 crc kubenswrapper[4788]: I0219 09:04:17.112876 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:17 crc kubenswrapper[4788]: I0219 09:04:17.136398 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.136355938 podStartE2EDuration="2.136355938s" podCreationTimestamp="2026-02-19 09:04:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:04:17.125303464 +0000 UTC m=+1159.113314926" watchObservedRunningTime="2026-02-19 09:04:17.136355938 +0000 UTC m=+1159.124367410"
Feb 19 09:04:21 crc kubenswrapper[4788]: I0219 09:04:21.551319 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 19 09:04:22 crc kubenswrapper[4788]: I0219 09:04:22.139797 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 09:04:22 crc kubenswrapper[4788]: I0219 09:04:22.140223 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 09:04:22 crc kubenswrapper[4788]: I0219 09:04:22.140417 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx"
Feb 19 09:04:22 crc kubenswrapper[4788]: I0219 09:04:22.141345 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31cfa590dbe60cb7189f587f667407f74d6387f19ad0205b2e674711ceebc406"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 09:04:22 crc kubenswrapper[4788]: I0219 09:04:22.141526 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://31cfa590dbe60cb7189f587f667407f74d6387f19ad0205b2e674711ceebc406" gracePeriod=600
Feb 19 09:04:23 crc kubenswrapper[4788]: I0219 09:04:23.172941 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="31cfa590dbe60cb7189f587f667407f74d6387f19ad0205b2e674711ceebc406" exitCode=0
Feb 19 09:04:23 crc kubenswrapper[4788]: I0219 09:04:23.173016 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"31cfa590dbe60cb7189f587f667407f74d6387f19ad0205b2e674711ceebc406"}
Feb 19 09:04:23 crc kubenswrapper[4788]: I0219 09:04:23.174640 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"defbc637313174f365a0e7e0457f9fbe5da4bafbab9b16543dceddb4ed84fa1d"}
Feb 19 09:04:23 crc kubenswrapper[4788]: I0219 09:04:23.174695 4788 scope.go:117] "RemoveContainer" containerID="d5bb01dc9098ceeecfb8b55d79c5464f45f8f2b74f74a77633116b07488417cd"
Feb 19 09:04:25 crc kubenswrapper[4788]: I0219 09:04:25.573941 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.020399 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-hwbvj"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.021442 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e03adfa-047d-4b46-937c-80d72bd604c5" containerName="heat-cfnapi"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.022202 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.025655 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.025735 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.031482 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hwbvj"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.122860 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.122922 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-scripts\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.123172 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8zkm\" (UniqueName: \"kubernetes.io/projected/2b48c414-8fa5-4654-b4c6-457650a816b4-kube-api-access-w8zkm\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.123306 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-config-data\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.225123 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8zkm\" (UniqueName: \"kubernetes.io/projected/2b48c414-8fa5-4654-b4c6-457650a816b4-kube-api-access-w8zkm\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.225233 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-config-data\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.225394 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.225508 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-scripts\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.235800 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.237348 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-config-data\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.260381 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-scripts\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.273151 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8zkm\" (UniqueName: \"kubernetes.io/projected/2b48c414-8fa5-4654-b4c6-457650a816b4-kube-api-access-w8zkm\") pod \"nova-cell0-cell-mapping-hwbvj\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.278874 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.279990 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.283322 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.311091 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.327107 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.327205 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-config-data\") pod \"nova-scheduler-0\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.327353 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdld2\" (UniqueName: \"kubernetes.io/projected/a7cd957e-8518-477b-bd07-02904043190a-kube-api-access-wdld2\") pod \"nova-scheduler-0\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.341972 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hwbvj"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.429219 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.429537 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-config-data\") pod \"nova-scheduler-0\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.429658 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdld2\" (UniqueName: \"kubernetes.io/projected/a7cd957e-8518-477b-bd07-02904043190a-kube-api-access-wdld2\") pod \"nova-scheduler-0\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.441800 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-config-data\") pod \"nova-scheduler-0\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.445630 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.459336 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.461306 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.478751 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdld2\" (UniqueName: \"kubernetes.io/projected/a7cd957e-8518-477b-bd07-02904043190a-kube-api-access-wdld2\") pod \"nova-scheduler-0\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.479065 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.479183 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.560334 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.561937 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.566548 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.568980 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.634852 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-config-data\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.634906 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.634971 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-logs\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.635059 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpzck\" (UniqueName: \"kubernetes.io/projected/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-kube-api-access-dpzck\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.652573 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.654028 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.658461 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.676763 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.694868 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.737144 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-config-data\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.737194 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.737257 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-config-data\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.737284 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.737321 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21597275-5627-4c64-a397-b0477a0ff5eb-logs\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.737368 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-logs\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.737457 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpzck\" (UniqueName: \"kubernetes.io/projected/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-kube-api-access-dpzck\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.737584 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twfkk\" (UniqueName: \"kubernetes.io/projected/21597275-5627-4c64-a397-b0477a0ff5eb-kube-api-access-twfkk\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.741554 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-logs\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.748809 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-config-data\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.758891 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.760197 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-8z8l9"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.761606 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.766562 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpzck\" (UniqueName: \"kubernetes.io/projected/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-kube-api-access-dpzck\") pod \"nova-api-0\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") " pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.826340 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-8z8l9"]
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.841252 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twfkk\" (UniqueName: \"kubernetes.io/projected/21597275-5627-4c64-a397-b0477a0ff5eb-kube-api-access-twfkk\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.841321 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-config-data\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.841338 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.841366 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f82971ff-0ab5-4ae4-8de7-73159394c022\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.841391 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21597275-5627-4c64-a397-b0477a0ff5eb-logs\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.841407 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll6d4\" (UniqueName: \"kubernetes.io/projected/f82971ff-0ab5-4ae4-8de7-73159394c022-kube-api-access-ll6d4\") pod \"nova-cell1-novncproxy-0\" (UID: \"f82971ff-0ab5-4ae4-8de7-73159394c022\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.841484 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f82971ff-0ab5-4ae4-8de7-73159394c022\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.846605 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21597275-5627-4c64-a397-b0477a0ff5eb-logs\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.848843 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.850056 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-config-data\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.862815 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twfkk\" (UniqueName: \"kubernetes.io/projected/21597275-5627-4c64-a397-b0477a0ff5eb-kube-api-access-twfkk\") pod \"nova-metadata-0\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.914836 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.931800 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.945298 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-svc\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.945375 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.945454 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f82971ff-0ab5-4ae4-8de7-73159394c022\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.945479 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9"
Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.945504 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID:
\"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.945524 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvvtr\" (UniqueName: \"kubernetes.io/projected/e055219d-2144-4750-8255-9bc573b74163-kube-api-access-hvvtr\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.945543 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-config\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.945609 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f82971ff-0ab5-4ae4-8de7-73159394c022\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.945638 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6d4\" (UniqueName: \"kubernetes.io/projected/f82971ff-0ab5-4ae4-8de7-73159394c022-kube-api-access-ll6d4\") pod \"nova-cell1-novncproxy-0\" (UID: \"f82971ff-0ab5-4ae4-8de7-73159394c022\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.952719 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"f82971ff-0ab5-4ae4-8de7-73159394c022\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.953137 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f82971ff-0ab5-4ae4-8de7-73159394c022\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:04:26 crc kubenswrapper[4788]: I0219 09:04:26.971059 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll6d4\" (UniqueName: \"kubernetes.io/projected/f82971ff-0ab5-4ae4-8de7-73159394c022-kube-api-access-ll6d4\") pod \"nova-cell1-novncproxy-0\" (UID: \"f82971ff-0ab5-4ae4-8de7-73159394c022\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.022973 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.047180 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-svc\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.047251 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.047318 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.047346 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.047365 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvvtr\" (UniqueName: \"kubernetes.io/projected/e055219d-2144-4750-8255-9bc573b74163-kube-api-access-hvvtr\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.047384 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-config\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.048423 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-config\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.049039 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-svc\") pod 
\"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.049712 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.050364 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.051035 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.068407 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvvtr\" (UniqueName: \"kubernetes.io/projected/e055219d-2144-4750-8255-9bc573b74163-kube-api-access-hvvtr\") pod \"dnsmasq-dns-9b86998b5-8z8l9\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") " pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.085461 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.118138 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hwbvj"] Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.259456 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.280201 4788 generic.go:334] "Generic (PLEG): container finished" podID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerID="986679b645b6aa54eddc48eb5a4484dfe0abf5f9a07c7070b17c0362e9399ae3" exitCode=137 Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.280400 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bd3c7f8-3b78-41c9-8265-bda65d7540b8","Type":"ContainerDied","Data":"986679b645b6aa54eddc48eb5a4484dfe0abf5f9a07c7070b17c0362e9399ae3"} Feb 19 09:04:27 crc kubenswrapper[4788]: W0219 09:04:27.282562 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7cd957e_8518_477b_bd07_02904043190a.slice/crio-4cd7f61967aad3369b09baaabef4cdb965241636b16fdaca6910dcda8bc373c0 WatchSource:0}: Error finding container 4cd7f61967aad3369b09baaabef4cdb965241636b16fdaca6910dcda8bc373c0: Status 404 returned error can't find the container with id 4cd7f61967aad3369b09baaabef4cdb965241636b16fdaca6910dcda8bc373c0 Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.285822 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hwbvj" event={"ID":"2b48c414-8fa5-4654-b4c6-457650a816b4","Type":"ContainerStarted","Data":"05a75356ff6eec55c72a7e9af29b70357ffa3d6d6da3f440afc565cc23be83e0"} Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.453399 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9xcz8"] Feb 19 09:04:27 crc 
kubenswrapper[4788]: I0219 09:04:27.454803 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.458691 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.458877 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.465020 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9xcz8"] Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.562590 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-scripts\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.562914 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsm79\" (UniqueName: \"kubernetes.io/projected/1b5fef27-0741-4f5a-9a12-fa6917cf16af-kube-api-access-dsm79\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.563239 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 
09:04:27.563446 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-config-data\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.597080 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.637260 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.666336 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsm79\" (UniqueName: \"kubernetes.io/projected/1b5fef27-0741-4f5a-9a12-fa6917cf16af-kube-api-access-dsm79\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.666408 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.666449 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-config-data\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.666491 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-scripts\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.673686 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-config-data\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.673704 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.676869 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-scripts\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.685122 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsm79\" (UniqueName: \"kubernetes.io/projected/1b5fef27-0741-4f5a-9a12-fa6917cf16af-kube-api-access-dsm79\") pod \"nova-cell1-conductor-db-sync-9xcz8\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: W0219 09:04:27.751165 4788 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode055219d_2144_4750_8255_9bc573b74163.slice/crio-864f3c952385a8b96978e7fb03fb8e947385c5e648bc7050a81c3877837aaea2 WatchSource:0}: Error finding container 864f3c952385a8b96978e7fb03fb8e947385c5e648bc7050a81c3877837aaea2: Status 404 returned error can't find the container with id 864f3c952385a8b96978e7fb03fb8e947385c5e648bc7050a81c3877837aaea2 Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.758610 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-8z8l9"] Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.771762 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:27 crc kubenswrapper[4788]: I0219 09:04:27.773282 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.034238 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.179901 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-scripts\") pod \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.179962 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb8dg\" (UniqueName: \"kubernetes.io/projected/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-kube-api-access-tb8dg\") pod \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.180009 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-sg-core-conf-yaml\") pod \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.180040 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-log-httpd\") pod \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.180206 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-run-httpd\") pod \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.180286 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-config-data\") pod \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.180421 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-combined-ca-bundle\") pod \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\" (UID: \"2bd3c7f8-3b78-41c9-8265-bda65d7540b8\") " Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.180674 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2bd3c7f8-3b78-41c9-8265-bda65d7540b8" (UID: "2bd3c7f8-3b78-41c9-8265-bda65d7540b8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.181204 4788 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.181213 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2bd3c7f8-3b78-41c9-8265-bda65d7540b8" (UID: "2bd3c7f8-3b78-41c9-8265-bda65d7540b8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.185394 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-kube-api-access-tb8dg" (OuterVolumeSpecName: "kube-api-access-tb8dg") pod "2bd3c7f8-3b78-41c9-8265-bda65d7540b8" (UID: "2bd3c7f8-3b78-41c9-8265-bda65d7540b8"). InnerVolumeSpecName "kube-api-access-tb8dg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.188914 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-scripts" (OuterVolumeSpecName: "scripts") pod "2bd3c7f8-3b78-41c9-8265-bda65d7540b8" (UID: "2bd3c7f8-3b78-41c9-8265-bda65d7540b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.227099 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2bd3c7f8-3b78-41c9-8265-bda65d7540b8" (UID: "2bd3c7f8-3b78-41c9-8265-bda65d7540b8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.283664 4788 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.283701 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.283711 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb8dg\" (UniqueName: \"kubernetes.io/projected/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-kube-api-access-tb8dg\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.283721 4788 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.288060 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-config-data" (OuterVolumeSpecName: "config-data") pod "2bd3c7f8-3b78-41c9-8265-bda65d7540b8" (UID: "2bd3c7f8-3b78-41c9-8265-bda65d7540b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.288836 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9xcz8"] Feb 19 09:04:28 crc kubenswrapper[4788]: W0219 09:04:28.291620 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b5fef27_0741_4f5a_9a12_fa6917cf16af.slice/crio-487a0cec5f51c0751f108336962a43e60f499374dceb2aab093186b5274e7cdc WatchSource:0}: Error finding container 487a0cec5f51c0751f108336962a43e60f499374dceb2aab093186b5274e7cdc: Status 404 returned error can't find the container with id 487a0cec5f51c0751f108336962a43e60f499374dceb2aab093186b5274e7cdc Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.293820 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bd3c7f8-3b78-41c9-8265-bda65d7540b8" (UID: "2bd3c7f8-3b78-41c9-8265-bda65d7540b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.299899 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7cd957e-8518-477b-bd07-02904043190a","Type":"ContainerStarted","Data":"4cd7f61967aad3369b09baaabef4cdb965241636b16fdaca6910dcda8bc373c0"} Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.307301 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef1b823e-57ef-4d43-8a80-01c3445f7c2c","Type":"ContainerStarted","Data":"755a3e4112f8e6d4b4d249520806490b5e7eac4118e846e460f57f2f0ea55134"} Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.311184 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21597275-5627-4c64-a397-b0477a0ff5eb","Type":"ContainerStarted","Data":"ad68b91bffb03079356ddce97da513f4f29d60b1168d5f76034b0e35f4b68db3"} Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.314344 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hwbvj" event={"ID":"2b48c414-8fa5-4654-b4c6-457650a816b4","Type":"ContainerStarted","Data":"1e28682a2bb344feef2f7fa689eee510e022cba1dd36cd73f323b4403d744975"} Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.322064 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2bd3c7f8-3b78-41c9-8265-bda65d7540b8","Type":"ContainerDied","Data":"8043cfa04b95e8e9eaefef688d9822959c42d9b0db400e878ba8283d053f6a92"} Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.322125 4788 scope.go:117] "RemoveContainer" containerID="986679b645b6aa54eddc48eb5a4484dfe0abf5f9a07c7070b17c0362e9399ae3" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.322291 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.328431 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f82971ff-0ab5-4ae4-8de7-73159394c022","Type":"ContainerStarted","Data":"116ed3607d3142147804194b62001eef08b769a93895b202faa0d892ec70bbc2"} Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.338892 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-hwbvj" podStartSLOduration=2.338873332 podStartE2EDuration="2.338873332s" podCreationTimestamp="2026-02-19 09:04:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:04:28.329166771 +0000 UTC m=+1170.317178273" watchObservedRunningTime="2026-02-19 09:04:28.338873332 +0000 UTC m=+1170.326884804" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.341631 4788 generic.go:334] "Generic (PLEG): container finished" podID="e055219d-2144-4750-8255-9bc573b74163" containerID="85566eeb878aef018b35fbdb094a4c3e4296ca5bfcdb24d34df15193e37f82f0" exitCode=0 Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.341685 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" event={"ID":"e055219d-2144-4750-8255-9bc573b74163","Type":"ContainerDied","Data":"85566eeb878aef018b35fbdb094a4c3e4296ca5bfcdb24d34df15193e37f82f0"} Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.341719 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" event={"ID":"e055219d-2144-4750-8255-9bc573b74163","Type":"ContainerStarted","Data":"864f3c952385a8b96978e7fb03fb8e947385c5e648bc7050a81c3877837aaea2"} Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.385528 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.385562 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bd3c7f8-3b78-41c9-8265-bda65d7540b8-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.568364 4788 scope.go:117] "RemoveContainer" containerID="a1fea8c70ea2af942b20f5a1943ce8cd02199112a9634e2db42227f4cfb5f5bc" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.583154 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.620710 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.643122 4788 scope.go:117] "RemoveContainer" containerID="c2acdb38a22b4421ec517a73398f462a5df06236b31665a50503e7c73569fbe9" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.647545 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:04:28 crc kubenswrapper[4788]: E0219 09:04:28.648046 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="ceilometer-central-agent" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.648074 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="ceilometer-central-agent" Feb 19 09:04:28 crc kubenswrapper[4788]: E0219 09:04:28.648096 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="ceilometer-notification-agent" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.648103 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" 
containerName="ceilometer-notification-agent" Feb 19 09:04:28 crc kubenswrapper[4788]: E0219 09:04:28.648117 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="sg-core" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.648128 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="sg-core" Feb 19 09:04:28 crc kubenswrapper[4788]: E0219 09:04:28.648147 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="proxy-httpd" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.648156 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="proxy-httpd" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.648406 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="sg-core" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.648423 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="ceilometer-central-agent" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.648439 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="ceilometer-notification-agent" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.648459 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" containerName="proxy-httpd" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.650405 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.653936 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.654198 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.660986 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.735766 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bd3c7f8-3b78-41c9-8265-bda65d7540b8" path="/var/lib/kubelet/pods/2bd3c7f8-3b78-41c9-8265-bda65d7540b8/volumes" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.802369 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-run-httpd\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.802454 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.802478 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-scripts\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.802504 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.802566 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj62h\" (UniqueName: \"kubernetes.io/projected/423bc761-9d6c-4518-9668-30dc36cdd536-kube-api-access-kj62h\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.802644 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-log-httpd\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.802670 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-config-data\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.887292 4788 scope.go:117] "RemoveContainer" containerID="d1d5a1e1769baefff8ea6c4a6ca66358441f831b351968db5f3b401a1fa66fcf" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.904394 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-run-httpd\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.904477 
4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.904503 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-scripts\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.904544 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.904627 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj62h\" (UniqueName: \"kubernetes.io/projected/423bc761-9d6c-4518-9668-30dc36cdd536-kube-api-access-kj62h\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.904707 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-log-httpd\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.904734 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-config-data\") pod \"ceilometer-0\" (UID: 
\"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.905176 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-run-httpd\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.909431 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-config-data\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.910019 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-log-httpd\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.917971 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-scripts\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.927520 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.929898 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.933148 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj62h\" (UniqueName: \"kubernetes.io/projected/423bc761-9d6c-4518-9668-30dc36cdd536-kube-api-access-kj62h\") pod \"ceilometer-0\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " pod="openstack/ceilometer-0" Feb 19 09:04:28 crc kubenswrapper[4788]: I0219 09:04:28.977366 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:04:29 crc kubenswrapper[4788]: I0219 09:04:29.347497 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:04:29 crc kubenswrapper[4788]: I0219 09:04:29.356747 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9xcz8" event={"ID":"1b5fef27-0741-4f5a-9a12-fa6917cf16af","Type":"ContainerStarted","Data":"5ae3e1f3005b2aa94da7d59bcd79e83657d3225550134345aa9ace61da634ba0"} Feb 19 09:04:29 crc kubenswrapper[4788]: I0219 09:04:29.357095 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9xcz8" event={"ID":"1b5fef27-0741-4f5a-9a12-fa6917cf16af","Type":"ContainerStarted","Data":"487a0cec5f51c0751f108336962a43e60f499374dceb2aab093186b5274e7cdc"} Feb 19 09:04:29 crc kubenswrapper[4788]: I0219 09:04:29.375185 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-9xcz8" podStartSLOduration=2.3751637580000002 podStartE2EDuration="2.375163758s" podCreationTimestamp="2026-02-19 09:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:04:29.371758293 +0000 UTC m=+1171.359769785" 
watchObservedRunningTime="2026-02-19 09:04:29.375163758 +0000 UTC m=+1171.363175230" Feb 19 09:04:29 crc kubenswrapper[4788]: I0219 09:04:29.375694 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" event={"ID":"e055219d-2144-4750-8255-9bc573b74163","Type":"ContainerStarted","Data":"ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c"} Feb 19 09:04:29 crc kubenswrapper[4788]: I0219 09:04:29.375852 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:29 crc kubenswrapper[4788]: I0219 09:04:29.429718 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" podStartSLOduration=3.42969571 podStartE2EDuration="3.42969571s" podCreationTimestamp="2026-02-19 09:04:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:04:29.393784779 +0000 UTC m=+1171.381796261" watchObservedRunningTime="2026-02-19 09:04:29.42969571 +0000 UTC m=+1171.417707182" Feb 19 09:04:30 crc kubenswrapper[4788]: I0219 09:04:30.124495 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 09:04:30 crc kubenswrapper[4788]: I0219 09:04:30.131399 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 09:04:31 crc kubenswrapper[4788]: W0219 09:04:31.703723 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod423bc761_9d6c_4518_9668_30dc36cdd536.slice/crio-effe47ae70c8e297510610953035ec6390f8b9c5d51174f6f6b7f547d12dbda9 WatchSource:0}: Error finding container effe47ae70c8e297510610953035ec6390f8b9c5d51174f6f6b7f547d12dbda9: Status 404 returned error can't find the container with id effe47ae70c8e297510610953035ec6390f8b9c5d51174f6f6b7f547d12dbda9 Feb 19 
09:04:32 crc kubenswrapper[4788]: I0219 09:04:32.402769 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"423bc761-9d6c-4518-9668-30dc36cdd536","Type":"ContainerStarted","Data":"effe47ae70c8e297510610953035ec6390f8b9c5d51174f6f6b7f547d12dbda9"} Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.417790 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21597275-5627-4c64-a397-b0477a0ff5eb","Type":"ContainerStarted","Data":"ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7"} Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.418116 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21597275-5627-4c64-a397-b0477a0ff5eb","Type":"ContainerStarted","Data":"4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81"} Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.418235 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21597275-5627-4c64-a397-b0477a0ff5eb" containerName="nova-metadata-log" containerID="cri-o://4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81" gracePeriod=30 Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.418893 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21597275-5627-4c64-a397-b0477a0ff5eb" containerName="nova-metadata-metadata" containerID="cri-o://ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7" gracePeriod=30 Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.423973 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"423bc761-9d6c-4518-9668-30dc36cdd536","Type":"ContainerStarted","Data":"c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850"} Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.426561 4788 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f82971ff-0ab5-4ae4-8de7-73159394c022","Type":"ContainerStarted","Data":"6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae"} Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.426714 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f82971ff-0ab5-4ae4-8de7-73159394c022" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae" gracePeriod=30 Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.429910 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7cd957e-8518-477b-bd07-02904043190a","Type":"ContainerStarted","Data":"95bd2fff0c8ec80293c2bfe9fbace3c8a5166adbfd0953574ef94264e9325d3c"} Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.437922 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef1b823e-57ef-4d43-8a80-01c3445f7c2c","Type":"ContainerStarted","Data":"96dd64663f5f5db9e2bf53db859a4510b8487f7bd65bb58ed88ac495e57a16b7"} Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.437981 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef1b823e-57ef-4d43-8a80-01c3445f7c2c","Type":"ContainerStarted","Data":"c70831671c343fd180d6e976569a54d0b4c7b0b195a1443f22279f4aff8d858e"} Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.451637 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.120155112 podStartE2EDuration="7.451230976s" podCreationTimestamp="2026-02-19 09:04:26 +0000 UTC" firstStartedPulling="2026-02-19 09:04:27.770505933 +0000 UTC m=+1169.758517405" lastFinishedPulling="2026-02-19 09:04:32.101581787 +0000 UTC m=+1174.089593269" observedRunningTime="2026-02-19 09:04:33.438360097 +0000 UTC 
m=+1175.426371579" watchObservedRunningTime="2026-02-19 09:04:33.451230976 +0000 UTC m=+1175.439242448" Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.467098 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.014997583 podStartE2EDuration="7.467077969s" podCreationTimestamp="2026-02-19 09:04:26 +0000 UTC" firstStartedPulling="2026-02-19 09:04:27.642006675 +0000 UTC m=+1169.630018147" lastFinishedPulling="2026-02-19 09:04:32.094087061 +0000 UTC m=+1174.082098533" observedRunningTime="2026-02-19 09:04:33.457655826 +0000 UTC m=+1175.445667308" watchObservedRunningTime="2026-02-19 09:04:33.467077969 +0000 UTC m=+1175.455089441" Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.480723 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.937318955 podStartE2EDuration="7.480704506s" podCreationTimestamp="2026-02-19 09:04:26 +0000 UTC" firstStartedPulling="2026-02-19 09:04:27.552055684 +0000 UTC m=+1169.540067156" lastFinishedPulling="2026-02-19 09:04:32.095441225 +0000 UTC m=+1174.083452707" observedRunningTime="2026-02-19 09:04:33.47395293 +0000 UTC m=+1175.461964402" watchObservedRunningTime="2026-02-19 09:04:33.480704506 +0000 UTC m=+1175.468715968" Feb 19 09:04:33 crc kubenswrapper[4788]: I0219 09:04:33.496734 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.689710724 podStartE2EDuration="7.496711774s" podCreationTimestamp="2026-02-19 09:04:26 +0000 UTC" firstStartedPulling="2026-02-19 09:04:27.287105452 +0000 UTC m=+1169.275116924" lastFinishedPulling="2026-02-19 09:04:32.094106502 +0000 UTC m=+1174.082117974" observedRunningTime="2026-02-19 09:04:33.492779646 +0000 UTC m=+1175.480791128" watchObservedRunningTime="2026-02-19 09:04:33.496711774 +0000 UTC m=+1175.484723246" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 
09:04:34.048276 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.213900 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21597275-5627-4c64-a397-b0477a0ff5eb-logs\") pod \"21597275-5627-4c64-a397-b0477a0ff5eb\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.214026 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-config-data\") pod \"21597275-5627-4c64-a397-b0477a0ff5eb\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.214227 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twfkk\" (UniqueName: \"kubernetes.io/projected/21597275-5627-4c64-a397-b0477a0ff5eb-kube-api-access-twfkk\") pod \"21597275-5627-4c64-a397-b0477a0ff5eb\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.214309 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21597275-5627-4c64-a397-b0477a0ff5eb-logs" (OuterVolumeSpecName: "logs") pod "21597275-5627-4c64-a397-b0477a0ff5eb" (UID: "21597275-5627-4c64-a397-b0477a0ff5eb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.214352 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-combined-ca-bundle\") pod \"21597275-5627-4c64-a397-b0477a0ff5eb\" (UID: \"21597275-5627-4c64-a397-b0477a0ff5eb\") " Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.215481 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21597275-5627-4c64-a397-b0477a0ff5eb-logs\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.233558 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21597275-5627-4c64-a397-b0477a0ff5eb-kube-api-access-twfkk" (OuterVolumeSpecName: "kube-api-access-twfkk") pod "21597275-5627-4c64-a397-b0477a0ff5eb" (UID: "21597275-5627-4c64-a397-b0477a0ff5eb"). InnerVolumeSpecName "kube-api-access-twfkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.252752 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-config-data" (OuterVolumeSpecName: "config-data") pod "21597275-5627-4c64-a397-b0477a0ff5eb" (UID: "21597275-5627-4c64-a397-b0477a0ff5eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.272172 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21597275-5627-4c64-a397-b0477a0ff5eb" (UID: "21597275-5627-4c64-a397-b0477a0ff5eb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.317826 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.318493 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twfkk\" (UniqueName: \"kubernetes.io/projected/21597275-5627-4c64-a397-b0477a0ff5eb-kube-api-access-twfkk\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.318617 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21597275-5627-4c64-a397-b0477a0ff5eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.449700 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"423bc761-9d6c-4518-9668-30dc36cdd536","Type":"ContainerStarted","Data":"53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0"} Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.490037 4788 generic.go:334] "Generic (PLEG): container finished" podID="21597275-5627-4c64-a397-b0477a0ff5eb" containerID="ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7" exitCode=0 Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.490082 4788 generic.go:334] "Generic (PLEG): container finished" podID="21597275-5627-4c64-a397-b0477a0ff5eb" containerID="4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81" exitCode=143 Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.490387 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21597275-5627-4c64-a397-b0477a0ff5eb","Type":"ContainerDied","Data":"ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7"} Feb 19 09:04:34 crc 
kubenswrapper[4788]: I0219 09:04:34.490437 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21597275-5627-4c64-a397-b0477a0ff5eb","Type":"ContainerDied","Data":"4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81"} Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.490452 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21597275-5627-4c64-a397-b0477a0ff5eb","Type":"ContainerDied","Data":"ad68b91bffb03079356ddce97da513f4f29d60b1168d5f76034b0e35f4b68db3"} Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.490472 4788 scope.go:117] "RemoveContainer" containerID="ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.490649 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.527680 4788 scope.go:117] "RemoveContainer" containerID="4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.544340 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.558062 4788 scope.go:117] "RemoveContainer" containerID="ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7" Feb 19 09:04:34 crc kubenswrapper[4788]: E0219 09:04:34.565211 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7\": container with ID starting with ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7 not found: ID does not exist" containerID="ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.565289 4788 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7"} err="failed to get container status \"ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7\": rpc error: code = NotFound desc = could not find container \"ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7\": container with ID starting with ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7 not found: ID does not exist" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.565321 4788 scope.go:117] "RemoveContainer" containerID="4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.565441 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 09:04:34 crc kubenswrapper[4788]: E0219 09:04:34.569438 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81\": container with ID starting with 4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81 not found: ID does not exist" containerID="4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.569488 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81"} err="failed to get container status \"4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81\": rpc error: code = NotFound desc = could not find container \"4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81\": container with ID starting with 4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81 not found: ID does not exist" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.569520 4788 scope.go:117] 
"RemoveContainer" containerID="ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.569862 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7"} err="failed to get container status \"ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7\": rpc error: code = NotFound desc = could not find container \"ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7\": container with ID starting with ca314a4be8aa68fd6ea11d3a27b962b708903063dbb18d522505dda092d447c7 not found: ID does not exist" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.569890 4788 scope.go:117] "RemoveContainer" containerID="4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.572554 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81"} err="failed to get container status \"4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81\": rpc error: code = NotFound desc = could not find container \"4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81\": container with ID starting with 4d955ea129bc33ad93179a8c96129f50ee062461568ab0db530bd49212078d81 not found: ID does not exist" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.574661 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 19 09:04:34 crc kubenswrapper[4788]: E0219 09:04:34.575035 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21597275-5627-4c64-a397-b0477a0ff5eb" containerName="nova-metadata-metadata" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.575054 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="21597275-5627-4c64-a397-b0477a0ff5eb" 
containerName="nova-metadata-metadata" Feb 19 09:04:34 crc kubenswrapper[4788]: E0219 09:04:34.575086 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21597275-5627-4c64-a397-b0477a0ff5eb" containerName="nova-metadata-log" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.575094 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="21597275-5627-4c64-a397-b0477a0ff5eb" containerName="nova-metadata-log" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.575290 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="21597275-5627-4c64-a397-b0477a0ff5eb" containerName="nova-metadata-log" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.575320 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="21597275-5627-4c64-a397-b0477a0ff5eb" containerName="nova-metadata-metadata" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.576486 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.586787 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.586959 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.595520 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.728836 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21597275-5627-4c64-a397-b0477a0ff5eb" path="/var/lib/kubelet/pods/21597275-5627-4c64-a397-b0477a0ff5eb/volumes" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.730797 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.730877 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60202d64-d839-48f5-9656-6c58eef9dc86-logs\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.730934 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5qv6\" (UniqueName: \"kubernetes.io/projected/60202d64-d839-48f5-9656-6c58eef9dc86-kube-api-access-l5qv6\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.731005 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.731158 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-config-data\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.832489 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60202d64-d839-48f5-9656-6c58eef9dc86-logs\") pod \"nova-metadata-0\" (UID: 
\"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.832559 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5qv6\" (UniqueName: \"kubernetes.io/projected/60202d64-d839-48f5-9656-6c58eef9dc86-kube-api-access-l5qv6\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.832590 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.832783 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-config-data\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.832838 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.833008 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60202d64-d839-48f5-9656-6c58eef9dc86-logs\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.839101 4788 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-config-data\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.839227 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.839848 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.859157 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5qv6\" (UniqueName: \"kubernetes.io/projected/60202d64-d839-48f5-9656-6c58eef9dc86-kube-api-access-l5qv6\") pod \"nova-metadata-0\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") " pod="openstack/nova-metadata-0" Feb 19 09:04:34 crc kubenswrapper[4788]: I0219 09:04:34.913538 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 09:04:35 crc kubenswrapper[4788]: I0219 09:04:35.505528 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"423bc761-9d6c-4518-9668-30dc36cdd536","Type":"ContainerStarted","Data":"aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c"} Feb 19 09:04:35 crc kubenswrapper[4788]: I0219 09:04:35.533344 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 09:04:35 crc kubenswrapper[4788]: W0219 09:04:35.540025 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60202d64_d839_48f5_9656_6c58eef9dc86.slice/crio-104c153c72d442f4f6f6ec06a43c30f868d132c72be2163883812d0cb091cbf0 WatchSource:0}: Error finding container 104c153c72d442f4f6f6ec06a43c30f868d132c72be2163883812d0cb091cbf0: Status 404 returned error can't find the container with id 104c153c72d442f4f6f6ec06a43c30f868d132c72be2163883812d0cb091cbf0 Feb 19 09:04:36 crc kubenswrapper[4788]: I0219 09:04:36.514986 4788 generic.go:334] "Generic (PLEG): container finished" podID="2b48c414-8fa5-4654-b4c6-457650a816b4" containerID="1e28682a2bb344feef2f7fa689eee510e022cba1dd36cd73f323b4403d744975" exitCode=0 Feb 19 09:04:36 crc kubenswrapper[4788]: I0219 09:04:36.515087 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hwbvj" event={"ID":"2b48c414-8fa5-4654-b4c6-457650a816b4","Type":"ContainerDied","Data":"1e28682a2bb344feef2f7fa689eee510e022cba1dd36cd73f323b4403d744975"} Feb 19 09:04:36 crc kubenswrapper[4788]: I0219 09:04:36.518139 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60202d64-d839-48f5-9656-6c58eef9dc86","Type":"ContainerStarted","Data":"ab032eebaa648366b54360a4379ada4f9c40b6b944ec02c2338a50e26ce0a054"} Feb 19 09:04:36 crc kubenswrapper[4788]: I0219 09:04:36.518176 4788 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60202d64-d839-48f5-9656-6c58eef9dc86","Type":"ContainerStarted","Data":"03c010f3555fa97f65bbdac9cde71ea55af050dd3e8b5db3087389c44348646a"} Feb 19 09:04:36 crc kubenswrapper[4788]: I0219 09:04:36.518189 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60202d64-d839-48f5-9656-6c58eef9dc86","Type":"ContainerStarted","Data":"104c153c72d442f4f6f6ec06a43c30f868d132c72be2163883812d0cb091cbf0"} Feb 19 09:04:36 crc kubenswrapper[4788]: I0219 09:04:36.561626 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.56159891 podStartE2EDuration="2.56159891s" podCreationTimestamp="2026-02-19 09:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:04:36.554097984 +0000 UTC m=+1178.542109466" watchObservedRunningTime="2026-02-19 09:04:36.56159891 +0000 UTC m=+1178.549610412" Feb 19 09:04:36 crc kubenswrapper[4788]: I0219 09:04:36.570101 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 09:04:36 crc kubenswrapper[4788]: I0219 09:04:36.570143 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 09:04:36 crc kubenswrapper[4788]: I0219 09:04:36.603522 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 09:04:36 crc kubenswrapper[4788]: I0219 09:04:36.915784 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 09:04:36 crc kubenswrapper[4788]: I0219 09:04:36.915851 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.023557 4788 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.088347 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.198122 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-ttjbs"] Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.198409 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" podUID="163969c1-3ad0-4173-ac14-6ef793fa8f13" containerName="dnsmasq-dns" containerID="cri-o://6119ccfe1a249c33c4e565dd52aab75efb891b491fb1347d6ac5e1c7cb311cd5" gracePeriod=10 Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.270086 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" podUID="163969c1-3ad0-4173-ac14-6ef793fa8f13" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.182:5353: connect: connection refused" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.546592 4788 generic.go:334] "Generic (PLEG): container finished" podID="163969c1-3ad0-4173-ac14-6ef793fa8f13" containerID="6119ccfe1a249c33c4e565dd52aab75efb891b491fb1347d6ac5e1c7cb311cd5" exitCode=0 Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.546910 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" event={"ID":"163969c1-3ad0-4173-ac14-6ef793fa8f13","Type":"ContainerDied","Data":"6119ccfe1a249c33c4e565dd52aab75efb891b491fb1347d6ac5e1c7cb311cd5"} Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.556232 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"423bc761-9d6c-4518-9668-30dc36cdd536","Type":"ContainerStarted","Data":"6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95"} Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.557316 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.587013 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.622727903 podStartE2EDuration="9.586987415s" podCreationTimestamp="2026-02-19 09:04:28 +0000 UTC" firstStartedPulling="2026-02-19 09:04:32.046597963 +0000 UTC m=+1174.034609475" lastFinishedPulling="2026-02-19 09:04:37.010857515 +0000 UTC m=+1178.998868987" observedRunningTime="2026-02-19 09:04:37.579360626 +0000 UTC m=+1179.567372118" watchObservedRunningTime="2026-02-19 09:04:37.586987415 +0000 UTC m=+1179.574998897" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.623545 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.739844 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.858280 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-config\") pod \"163969c1-3ad0-4173-ac14-6ef793fa8f13\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.858383 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-svc\") pod \"163969c1-3ad0-4173-ac14-6ef793fa8f13\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.858425 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg6w9\" (UniqueName: \"kubernetes.io/projected/163969c1-3ad0-4173-ac14-6ef793fa8f13-kube-api-access-kg6w9\") pod \"163969c1-3ad0-4173-ac14-6ef793fa8f13\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.858578 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-swift-storage-0\") pod \"163969c1-3ad0-4173-ac14-6ef793fa8f13\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.858595 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-nb\") pod \"163969c1-3ad0-4173-ac14-6ef793fa8f13\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.858624 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-sb\") pod \"163969c1-3ad0-4173-ac14-6ef793fa8f13\" (UID: \"163969c1-3ad0-4173-ac14-6ef793fa8f13\") " Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.895359 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/163969c1-3ad0-4173-ac14-6ef793fa8f13-kube-api-access-kg6w9" (OuterVolumeSpecName: "kube-api-access-kg6w9") pod "163969c1-3ad0-4173-ac14-6ef793fa8f13" (UID: "163969c1-3ad0-4173-ac14-6ef793fa8f13"). InnerVolumeSpecName "kube-api-access-kg6w9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.914410 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "163969c1-3ad0-4173-ac14-6ef793fa8f13" (UID: "163969c1-3ad0-4173-ac14-6ef793fa8f13"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.951859 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "163969c1-3ad0-4173-ac14-6ef793fa8f13" (UID: "163969c1-3ad0-4173-ac14-6ef793fa8f13"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.956822 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-config" (OuterVolumeSpecName: "config") pod "163969c1-3ad0-4173-ac14-6ef793fa8f13" (UID: "163969c1-3ad0-4173-ac14-6ef793fa8f13"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.961541 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg6w9\" (UniqueName: \"kubernetes.io/projected/163969c1-3ad0-4173-ac14-6ef793fa8f13-kube-api-access-kg6w9\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.961582 4788 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.961595 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.961606 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.967912 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "163969c1-3ad0-4173-ac14-6ef793fa8f13" (UID: "163969c1-3ad0-4173-ac14-6ef793fa8f13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.969848 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "163969c1-3ad0-4173-ac14-6ef793fa8f13" (UID: "163969c1-3ad0-4173-ac14-6ef793fa8f13"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:04:37 crc kubenswrapper[4788]: I0219 09:04:37.989719 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hwbvj" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.000429 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.000485 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.066063 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.066113 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/163969c1-3ad0-4173-ac14-6ef793fa8f13-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.168678 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-combined-ca-bundle\") pod \"2b48c414-8fa5-4654-b4c6-457650a816b4\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.168770 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-config-data\") pod \"2b48c414-8fa5-4654-b4c6-457650a816b4\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.168932 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-scripts\") pod \"2b48c414-8fa5-4654-b4c6-457650a816b4\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.169051 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8zkm\" (UniqueName: \"kubernetes.io/projected/2b48c414-8fa5-4654-b4c6-457650a816b4-kube-api-access-w8zkm\") pod \"2b48c414-8fa5-4654-b4c6-457650a816b4\" (UID: \"2b48c414-8fa5-4654-b4c6-457650a816b4\") " Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.175073 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-scripts" (OuterVolumeSpecName: "scripts") pod "2b48c414-8fa5-4654-b4c6-457650a816b4" (UID: "2b48c414-8fa5-4654-b4c6-457650a816b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.178491 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b48c414-8fa5-4654-b4c6-457650a816b4-kube-api-access-w8zkm" (OuterVolumeSpecName: "kube-api-access-w8zkm") pod "2b48c414-8fa5-4654-b4c6-457650a816b4" (UID: "2b48c414-8fa5-4654-b4c6-457650a816b4"). InnerVolumeSpecName "kube-api-access-w8zkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.208367 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-config-data" (OuterVolumeSpecName: "config-data") pod "2b48c414-8fa5-4654-b4c6-457650a816b4" (UID: "2b48c414-8fa5-4654-b4c6-457650a816b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.222988 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b48c414-8fa5-4654-b4c6-457650a816b4" (UID: "2b48c414-8fa5-4654-b4c6-457650a816b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.271697 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.271735 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8zkm\" (UniqueName: \"kubernetes.io/projected/2b48c414-8fa5-4654-b4c6-457650a816b4-kube-api-access-w8zkm\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.271750 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.271764 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b48c414-8fa5-4654-b4c6-457650a816b4-config-data\") on node \"crc\" DevicePath \"\"" Feb 
19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.566516 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" event={"ID":"163969c1-3ad0-4173-ac14-6ef793fa8f13","Type":"ContainerDied","Data":"6f9b3064b63d6ffe3478cb305072f83378331aa278d1b9136c6e24674db16629"} Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.566525 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-ttjbs" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.566572 4788 scope.go:117] "RemoveContainer" containerID="6119ccfe1a249c33c4e565dd52aab75efb891b491fb1347d6ac5e1c7cb311cd5" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.569748 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hwbvj" event={"ID":"2b48c414-8fa5-4654-b4c6-457650a816b4","Type":"ContainerDied","Data":"05a75356ff6eec55c72a7e9af29b70357ffa3d6d6da3f440afc565cc23be83e0"} Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.569806 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05a75356ff6eec55c72a7e9af29b70357ffa3d6d6da3f440afc565cc23be83e0" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.569875 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hwbvj" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.585895 4788 scope.go:117] "RemoveContainer" containerID="fb5056a71bee8bc419c6855dc853b73663ccd072c837967f551b8c30e64a2dd1" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.618580 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-ttjbs"] Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.631327 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-ttjbs"] Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.726235 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="163969c1-3ad0-4173-ac14-6ef793fa8f13" path="/var/lib/kubelet/pods/163969c1-3ad0-4173-ac14-6ef793fa8f13/volumes" Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.727053 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.727092 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.727365 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerName="nova-api-log" containerID="cri-o://c70831671c343fd180d6e976569a54d0b4c7b0b195a1443f22279f4aff8d858e" gracePeriod=30 Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.727760 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerName="nova-api-api" containerID="cri-o://96dd64663f5f5db9e2bf53db859a4510b8487f7bd65bb58ed88ac495e57a16b7" gracePeriod=30 Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.745131 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 19 09:04:38 crc 
kubenswrapper[4788]: I0219 09:04:38.745609 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="60202d64-d839-48f5-9656-6c58eef9dc86" containerName="nova-metadata-log" containerID="cri-o://03c010f3555fa97f65bbdac9cde71ea55af050dd3e8b5db3087389c44348646a" gracePeriod=30 Feb 19 09:04:38 crc kubenswrapper[4788]: I0219 09:04:38.746222 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="60202d64-d839-48f5-9656-6c58eef9dc86" containerName="nova-metadata-metadata" containerID="cri-o://ab032eebaa648366b54360a4379ada4f9c40b6b944ec02c2338a50e26ce0a054" gracePeriod=30 Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.598138 4788 generic.go:334] "Generic (PLEG): container finished" podID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerID="c70831671c343fd180d6e976569a54d0b4c7b0b195a1443f22279f4aff8d858e" exitCode=143 Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.598311 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef1b823e-57ef-4d43-8a80-01c3445f7c2c","Type":"ContainerDied","Data":"c70831671c343fd180d6e976569a54d0b4c7b0b195a1443f22279f4aff8d858e"} Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.602678 4788 generic.go:334] "Generic (PLEG): container finished" podID="60202d64-d839-48f5-9656-6c58eef9dc86" containerID="ab032eebaa648366b54360a4379ada4f9c40b6b944ec02c2338a50e26ce0a054" exitCode=0 Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.602703 4788 generic.go:334] "Generic (PLEG): container finished" podID="60202d64-d839-48f5-9656-6c58eef9dc86" containerID="03c010f3555fa97f65bbdac9cde71ea55af050dd3e8b5db3087389c44348646a" exitCode=143 Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.602765 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"60202d64-d839-48f5-9656-6c58eef9dc86","Type":"ContainerDied","Data":"ab032eebaa648366b54360a4379ada4f9c40b6b944ec02c2338a50e26ce0a054"}
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.602803 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60202d64-d839-48f5-9656-6c58eef9dc86","Type":"ContainerDied","Data":"03c010f3555fa97f65bbdac9cde71ea55af050dd3e8b5db3087389c44348646a"}
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.602830 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"60202d64-d839-48f5-9656-6c58eef9dc86","Type":"ContainerDied","Data":"104c153c72d442f4f6f6ec06a43c30f868d132c72be2163883812d0cb091cbf0"}
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.602842 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="104c153c72d442f4f6f6ec06a43c30f868d132c72be2163883812d0cb091cbf0"
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.602856 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a7cd957e-8518-477b-bd07-02904043190a" containerName="nova-scheduler-scheduler" containerID="cri-o://95bd2fff0c8ec80293c2bfe9fbace3c8a5166adbfd0953574ef94264e9325d3c" gracePeriod=30
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.620091 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.702411 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-config-data\") pod \"60202d64-d839-48f5-9656-6c58eef9dc86\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") "
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.702514 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-combined-ca-bundle\") pod \"60202d64-d839-48f5-9656-6c58eef9dc86\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") "
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.702557 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-nova-metadata-tls-certs\") pod \"60202d64-d839-48f5-9656-6c58eef9dc86\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") "
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.702628 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5qv6\" (UniqueName: \"kubernetes.io/projected/60202d64-d839-48f5-9656-6c58eef9dc86-kube-api-access-l5qv6\") pod \"60202d64-d839-48f5-9656-6c58eef9dc86\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") "
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.702699 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60202d64-d839-48f5-9656-6c58eef9dc86-logs\") pod \"60202d64-d839-48f5-9656-6c58eef9dc86\" (UID: \"60202d64-d839-48f5-9656-6c58eef9dc86\") "
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.703328 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60202d64-d839-48f5-9656-6c58eef9dc86-logs" (OuterVolumeSpecName: "logs") pod "60202d64-d839-48f5-9656-6c58eef9dc86" (UID: "60202d64-d839-48f5-9656-6c58eef9dc86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.703617 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60202d64-d839-48f5-9656-6c58eef9dc86-logs\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.721856 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60202d64-d839-48f5-9656-6c58eef9dc86-kube-api-access-l5qv6" (OuterVolumeSpecName: "kube-api-access-l5qv6") pod "60202d64-d839-48f5-9656-6c58eef9dc86" (UID: "60202d64-d839-48f5-9656-6c58eef9dc86"). InnerVolumeSpecName "kube-api-access-l5qv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.741375 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60202d64-d839-48f5-9656-6c58eef9dc86" (UID: "60202d64-d839-48f5-9656-6c58eef9dc86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.744799 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-config-data" (OuterVolumeSpecName: "config-data") pod "60202d64-d839-48f5-9656-6c58eef9dc86" (UID: "60202d64-d839-48f5-9656-6c58eef9dc86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.805225 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.805279 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5qv6\" (UniqueName: \"kubernetes.io/projected/60202d64-d839-48f5-9656-6c58eef9dc86-kube-api-access-l5qv6\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.805293 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.819096 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "60202d64-d839-48f5-9656-6c58eef9dc86" (UID: "60202d64-d839-48f5-9656-6c58eef9dc86"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:04:39 crc kubenswrapper[4788]: I0219 09:04:39.907396 4788 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/60202d64-d839-48f5-9656-6c58eef9dc86-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.618566 4788 generic.go:334] "Generic (PLEG): container finished" podID="a7cd957e-8518-477b-bd07-02904043190a" containerID="95bd2fff0c8ec80293c2bfe9fbace3c8a5166adbfd0953574ef94264e9325d3c" exitCode=0
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.618705 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7cd957e-8518-477b-bd07-02904043190a","Type":"ContainerDied","Data":"95bd2fff0c8ec80293c2bfe9fbace3c8a5166adbfd0953574ef94264e9325d3c"}
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.618975 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a7cd957e-8518-477b-bd07-02904043190a","Type":"ContainerDied","Data":"4cd7f61967aad3369b09baaabef4cdb965241636b16fdaca6910dcda8bc373c0"}
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.618988 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd7f61967aad3369b09baaabef4cdb965241636b16fdaca6910dcda8bc373c0"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.619109 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.694162 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.730761 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.753192 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.770646 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:04:40 crc kubenswrapper[4788]: E0219 09:04:40.771227 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b48c414-8fa5-4654-b4c6-457650a816b4" containerName="nova-manage"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.771314 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b48c414-8fa5-4654-b4c6-457650a816b4" containerName="nova-manage"
Feb 19 09:04:40 crc kubenswrapper[4788]: E0219 09:04:40.771374 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60202d64-d839-48f5-9656-6c58eef9dc86" containerName="nova-metadata-log"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.771386 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="60202d64-d839-48f5-9656-6c58eef9dc86" containerName="nova-metadata-log"
Feb 19 09:04:40 crc kubenswrapper[4788]: E0219 09:04:40.771435 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="163969c1-3ad0-4173-ac14-6ef793fa8f13" containerName="dnsmasq-dns"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.771447 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="163969c1-3ad0-4173-ac14-6ef793fa8f13" containerName="dnsmasq-dns"
Feb 19 09:04:40 crc kubenswrapper[4788]: E0219 09:04:40.771488 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7cd957e-8518-477b-bd07-02904043190a" containerName="nova-scheduler-scheduler"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.771536 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7cd957e-8518-477b-bd07-02904043190a" containerName="nova-scheduler-scheduler"
Feb 19 09:04:40 crc kubenswrapper[4788]: E0219 09:04:40.771571 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="163969c1-3ad0-4173-ac14-6ef793fa8f13" containerName="init"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.771595 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="163969c1-3ad0-4173-ac14-6ef793fa8f13" containerName="init"
Feb 19 09:04:40 crc kubenswrapper[4788]: E0219 09:04:40.771611 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60202d64-d839-48f5-9656-6c58eef9dc86" containerName="nova-metadata-metadata"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.771620 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="60202d64-d839-48f5-9656-6c58eef9dc86" containerName="nova-metadata-metadata"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.772466 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="60202d64-d839-48f5-9656-6c58eef9dc86" containerName="nova-metadata-metadata"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.772508 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7cd957e-8518-477b-bd07-02904043190a" containerName="nova-scheduler-scheduler"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.772542 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="60202d64-d839-48f5-9656-6c58eef9dc86" containerName="nova-metadata-log"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.772603 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="163969c1-3ad0-4173-ac14-6ef793fa8f13" containerName="dnsmasq-dns"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.772640 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b48c414-8fa5-4654-b4c6-457650a816b4" containerName="nova-manage"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.774164 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.780398 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.782556 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.831515 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.832776 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-config-data\") pod \"a7cd957e-8518-477b-bd07-02904043190a\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") "
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.832955 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-combined-ca-bundle\") pod \"a7cd957e-8518-477b-bd07-02904043190a\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") "
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.833038 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdld2\" (UniqueName: \"kubernetes.io/projected/a7cd957e-8518-477b-bd07-02904043190a-kube-api-access-wdld2\") pod \"a7cd957e-8518-477b-bd07-02904043190a\" (UID: \"a7cd957e-8518-477b-bd07-02904043190a\") "
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.842014 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7cd957e-8518-477b-bd07-02904043190a-kube-api-access-wdld2" (OuterVolumeSpecName: "kube-api-access-wdld2") pod "a7cd957e-8518-477b-bd07-02904043190a" (UID: "a7cd957e-8518-477b-bd07-02904043190a"). InnerVolumeSpecName "kube-api-access-wdld2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.862549 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7cd957e-8518-477b-bd07-02904043190a" (UID: "a7cd957e-8518-477b-bd07-02904043190a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.864140 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-config-data" (OuterVolumeSpecName: "config-data") pod "a7cd957e-8518-477b-bd07-02904043190a" (UID: "a7cd957e-8518-477b-bd07-02904043190a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.936101 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.936365 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.936519 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-config-data\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.936621 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3812032-b03d-4773-bb95-7ccbb252b7de-logs\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.936680 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl2cr\" (UniqueName: \"kubernetes.io/projected/e3812032-b03d-4773-bb95-7ccbb252b7de-kube-api-access-rl2cr\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.936802 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.936825 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cd957e-8518-477b-bd07-02904043190a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:40 crc kubenswrapper[4788]: I0219 09:04:40.936841 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdld2\" (UniqueName: \"kubernetes.io/projected/a7cd957e-8518-477b-bd07-02904043190a-kube-api-access-wdld2\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.038288 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.038620 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.038670 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-config-data\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.038732 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3812032-b03d-4773-bb95-7ccbb252b7de-logs\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.038765 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl2cr\" (UniqueName: \"kubernetes.io/projected/e3812032-b03d-4773-bb95-7ccbb252b7de-kube-api-access-rl2cr\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.039737 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3812032-b03d-4773-bb95-7ccbb252b7de-logs\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.045158 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.046072 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-config-data\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.048506 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.058281 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl2cr\" (UniqueName: \"kubernetes.io/projected/e3812032-b03d-4773-bb95-7ccbb252b7de-kube-api-access-rl2cr\") pod \"nova-metadata-0\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " pod="openstack/nova-metadata-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.097595 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.579427 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:04:41 crc kubenswrapper[4788]: W0219 09:04:41.584472 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3812032_b03d_4773_bb95_7ccbb252b7de.slice/crio-17e7d9c44446319f661c66c8f2f07214532fa665007f695c1362eecd3a43cdf7 WatchSource:0}: Error finding container 17e7d9c44446319f661c66c8f2f07214532fa665007f695c1362eecd3a43cdf7: Status 404 returned error can't find the container with id 17e7d9c44446319f661c66c8f2f07214532fa665007f695c1362eecd3a43cdf7
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.628728 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3812032-b03d-4773-bb95-7ccbb252b7de","Type":"ContainerStarted","Data":"17e7d9c44446319f661c66c8f2f07214532fa665007f695c1362eecd3a43cdf7"}
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.628756 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.686942 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.699824 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.713614 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.715159 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.717428 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.740793 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.855657 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.855841 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlr9c\" (UniqueName: \"kubernetes.io/projected/cb45b82b-6e45-4780-a2e7-99dcddc48974-kube-api-access-mlr9c\") pod \"nova-scheduler-0\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.856114 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-config-data\") pod \"nova-scheduler-0\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.957371 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.957424 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlr9c\" (UniqueName: \"kubernetes.io/projected/cb45b82b-6e45-4780-a2e7-99dcddc48974-kube-api-access-mlr9c\") pod \"nova-scheduler-0\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.957524 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-config-data\") pod \"nova-scheduler-0\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.962379 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.964701 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-config-data\") pod \"nova-scheduler-0\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:41 crc kubenswrapper[4788]: I0219 09:04:41.975430 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlr9c\" (UniqueName: \"kubernetes.io/projected/cb45b82b-6e45-4780-a2e7-99dcddc48974-kube-api-access-mlr9c\") pod \"nova-scheduler-0\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " pod="openstack/nova-scheduler-0"
Feb 19 09:04:42 crc kubenswrapper[4788]: I0219 09:04:42.037977 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 19 09:04:42 crc kubenswrapper[4788]: I0219 09:04:42.461730 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 09:04:42 crc kubenswrapper[4788]: I0219 09:04:42.642369 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb45b82b-6e45-4780-a2e7-99dcddc48974","Type":"ContainerStarted","Data":"a8f3de6fbdd734033333da5218c4569baa79a42e6b32aace7751563aa3fd530d"}
Feb 19 09:04:42 crc kubenswrapper[4788]: I0219 09:04:42.644753 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3812032-b03d-4773-bb95-7ccbb252b7de","Type":"ContainerStarted","Data":"37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc"}
Feb 19 09:04:42 crc kubenswrapper[4788]: I0219 09:04:42.644788 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3812032-b03d-4773-bb95-7ccbb252b7de","Type":"ContainerStarted","Data":"697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084"}
Feb 19 09:04:42 crc kubenswrapper[4788]: I0219 09:04:42.674112 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.674084443 podStartE2EDuration="2.674084443s" podCreationTimestamp="2026-02-19 09:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:04:42.663993023 +0000 UTC m=+1184.652004515" watchObservedRunningTime="2026-02-19 09:04:42.674084443 +0000 UTC m=+1184.662095915"
Feb 19 09:04:42 crc kubenswrapper[4788]: I0219 09:04:42.731828 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60202d64-d839-48f5-9656-6c58eef9dc86" path="/var/lib/kubelet/pods/60202d64-d839-48f5-9656-6c58eef9dc86/volumes"
Feb 19 09:04:42 crc kubenswrapper[4788]: I0219 09:04:42.735304 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7cd957e-8518-477b-bd07-02904043190a" path="/var/lib/kubelet/pods/a7cd957e-8518-477b-bd07-02904043190a/volumes"
Feb 19 09:04:43 crc kubenswrapper[4788]: I0219 09:04:43.655938 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb45b82b-6e45-4780-a2e7-99dcddc48974","Type":"ContainerStarted","Data":"c77958088275cc498f43c2050efd4a95057a16e5178fb4c4224d8ac2ed6ee44d"}
Feb 19 09:04:43 crc kubenswrapper[4788]: I0219 09:04:43.672543 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.67251546 podStartE2EDuration="2.67251546s" podCreationTimestamp="2026-02-19 09:04:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:04:43.671358421 +0000 UTC m=+1185.659369903" watchObservedRunningTime="2026-02-19 09:04:43.67251546 +0000 UTC m=+1185.660526952"
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.703677 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef1b823e-57ef-4d43-8a80-01c3445f7c2c","Type":"ContainerDied","Data":"96dd64663f5f5db9e2bf53db859a4510b8487f7bd65bb58ed88ac495e57a16b7"}
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.703598 4788 generic.go:334] "Generic (PLEG): container finished" podID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerID="96dd64663f5f5db9e2bf53db859a4510b8487f7bd65bb58ed88ac495e57a16b7" exitCode=0
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.705017 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef1b823e-57ef-4d43-8a80-01c3445f7c2c","Type":"ContainerDied","Data":"755a3e4112f8e6d4b4d249520806490b5e7eac4118e846e460f57f2f0ea55134"}
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.705040 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="755a3e4112f8e6d4b4d249520806490b5e7eac4118e846e460f57f2f0ea55134"
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.718876 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.816851 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-config-data\") pod \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") "
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.817357 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpzck\" (UniqueName: \"kubernetes.io/projected/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-kube-api-access-dpzck\") pod \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") "
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.817514 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-combined-ca-bundle\") pod \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") "
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.817624 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-logs\") pod \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\" (UID: \"ef1b823e-57ef-4d43-8a80-01c3445f7c2c\") "
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.820559 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-logs" (OuterVolumeSpecName: "logs") pod "ef1b823e-57ef-4d43-8a80-01c3445f7c2c" (UID: "ef1b823e-57ef-4d43-8a80-01c3445f7c2c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.822531 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-kube-api-access-dpzck" (OuterVolumeSpecName: "kube-api-access-dpzck") pod "ef1b823e-57ef-4d43-8a80-01c3445f7c2c" (UID: "ef1b823e-57ef-4d43-8a80-01c3445f7c2c"). InnerVolumeSpecName "kube-api-access-dpzck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.856028 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef1b823e-57ef-4d43-8a80-01c3445f7c2c" (UID: "ef1b823e-57ef-4d43-8a80-01c3445f7c2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.859526 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-config-data" (OuterVolumeSpecName: "config-data") pod "ef1b823e-57ef-4d43-8a80-01c3445f7c2c" (UID: "ef1b823e-57ef-4d43-8a80-01c3445f7c2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.920778 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.920815 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpzck\" (UniqueName: \"kubernetes.io/projected/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-kube-api-access-dpzck\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.920826 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:44 crc kubenswrapper[4788]: I0219 09:04:44.920834 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef1b823e-57ef-4d43-8a80-01c3445f7c2c-logs\") on node \"crc\" DevicePath \"\""
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.713894 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.759555 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.773506 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.796224 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 09:04:45 crc kubenswrapper[4788]: E0219 09:04:45.796705 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerName="nova-api-api"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.796729 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerName="nova-api-api"
Feb 19 09:04:45 crc kubenswrapper[4788]: E0219 09:04:45.796771 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerName="nova-api-log"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.796780 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerName="nova-api-log"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.797028 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerName="nova-api-log"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.797058 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" containerName="nova-api-api"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.798228 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.801315 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.810682 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.838336 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-config-data\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.838460 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.838673 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ead14dd-01c7-49b3-9509-413bf05d77e6-logs\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.839225 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d2xh\" (UniqueName: \"kubernetes.io/projected/2ead14dd-01c7-49b3-9509-413bf05d77e6-kube-api-access-6d2xh\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0"
Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.941141 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\"
(UniqueName: \"kubernetes.io/empty-dir/2ead14dd-01c7-49b3-9509-413bf05d77e6-logs\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0" Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.941203 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d2xh\" (UniqueName: \"kubernetes.io/projected/2ead14dd-01c7-49b3-9509-413bf05d77e6-kube-api-access-6d2xh\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0" Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.941385 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-config-data\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0" Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.941426 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0" Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.941726 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ead14dd-01c7-49b3-9509-413bf05d77e6-logs\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0" Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.947366 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-config-data\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0" Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.947395 4788 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0" Feb 19 09:04:45 crc kubenswrapper[4788]: I0219 09:04:45.966048 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d2xh\" (UniqueName: \"kubernetes.io/projected/2ead14dd-01c7-49b3-9509-413bf05d77e6-kube-api-access-6d2xh\") pod \"nova-api-0\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " pod="openstack/nova-api-0" Feb 19 09:04:46 crc kubenswrapper[4788]: I0219 09:04:46.097847 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 09:04:46 crc kubenswrapper[4788]: I0219 09:04:46.097914 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 19 09:04:46 crc kubenswrapper[4788]: I0219 09:04:46.133731 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 09:04:46 crc kubenswrapper[4788]: I0219 09:04:46.550012 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 09:04:46 crc kubenswrapper[4788]: I0219 09:04:46.727536 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef1b823e-57ef-4d43-8a80-01c3445f7c2c" path="/var/lib/kubelet/pods/ef1b823e-57ef-4d43-8a80-01c3445f7c2c/volumes" Feb 19 09:04:46 crc kubenswrapper[4788]: I0219 09:04:46.728113 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2ead14dd-01c7-49b3-9509-413bf05d77e6","Type":"ContainerStarted","Data":"9dbb28099dac1caa109bb90d69042d88d8e55a34e9e8c1ef986a8222a6bc61e0"} Feb 19 09:04:46 crc kubenswrapper[4788]: I0219 09:04:46.728373 4788 generic.go:334] "Generic (PLEG): container finished" podID="1b5fef27-0741-4f5a-9a12-fa6917cf16af" containerID="5ae3e1f3005b2aa94da7d59bcd79e83657d3225550134345aa9ace61da634ba0" exitCode=0 Feb 19 09:04:46 crc kubenswrapper[4788]: I0219 09:04:46.728393 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9xcz8" event={"ID":"1b5fef27-0741-4f5a-9a12-fa6917cf16af","Type":"ContainerDied","Data":"5ae3e1f3005b2aa94da7d59bcd79e83657d3225550134345aa9ace61da634ba0"} Feb 19 09:04:47 crc kubenswrapper[4788]: I0219 09:04:47.038510 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 19 09:04:47 crc kubenswrapper[4788]: I0219 09:04:47.767524 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2ead14dd-01c7-49b3-9509-413bf05d77e6","Type":"ContainerStarted","Data":"119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186"} Feb 19 09:04:47 crc kubenswrapper[4788]: I0219 09:04:47.767857 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2ead14dd-01c7-49b3-9509-413bf05d77e6","Type":"ContainerStarted","Data":"095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c"} Feb 19 09:04:47 crc kubenswrapper[4788]: I0219 09:04:47.799882 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.79986083 podStartE2EDuration="2.79986083s" podCreationTimestamp="2026-02-19 09:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:04:47.789399161 +0000 UTC m=+1189.777410653" watchObservedRunningTime="2026-02-19 09:04:47.79986083 +0000 UTC m=+1189.787872302" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.201795 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.290294 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsm79\" (UniqueName: \"kubernetes.io/projected/1b5fef27-0741-4f5a-9a12-fa6917cf16af-kube-api-access-dsm79\") pod \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.290364 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-scripts\") pod \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.290457 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-combined-ca-bundle\") pod \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " Feb 19 09:04:48 crc 
kubenswrapper[4788]: I0219 09:04:48.290536 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-config-data\") pod \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\" (UID: \"1b5fef27-0741-4f5a-9a12-fa6917cf16af\") " Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.297066 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5fef27-0741-4f5a-9a12-fa6917cf16af-kube-api-access-dsm79" (OuterVolumeSpecName: "kube-api-access-dsm79") pod "1b5fef27-0741-4f5a-9a12-fa6917cf16af" (UID: "1b5fef27-0741-4f5a-9a12-fa6917cf16af"). InnerVolumeSpecName "kube-api-access-dsm79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.300503 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-scripts" (OuterVolumeSpecName: "scripts") pod "1b5fef27-0741-4f5a-9a12-fa6917cf16af" (UID: "1b5fef27-0741-4f5a-9a12-fa6917cf16af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.328319 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-config-data" (OuterVolumeSpecName: "config-data") pod "1b5fef27-0741-4f5a-9a12-fa6917cf16af" (UID: "1b5fef27-0741-4f5a-9a12-fa6917cf16af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.337475 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b5fef27-0741-4f5a-9a12-fa6917cf16af" (UID: "1b5fef27-0741-4f5a-9a12-fa6917cf16af"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.393009 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsm79\" (UniqueName: \"kubernetes.io/projected/1b5fef27-0741-4f5a-9a12-fa6917cf16af-kube-api-access-dsm79\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.393601 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.393618 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.393631 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5fef27-0741-4f5a-9a12-fa6917cf16af-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.794563 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-9xcz8" event={"ID":"1b5fef27-0741-4f5a-9a12-fa6917cf16af","Type":"ContainerDied","Data":"487a0cec5f51c0751f108336962a43e60f499374dceb2aab093186b5274e7cdc"} Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.794698 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="487a0cec5f51c0751f108336962a43e60f499374dceb2aab093186b5274e7cdc" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.794597 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-9xcz8" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.871701 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 09:04:48 crc kubenswrapper[4788]: E0219 09:04:48.872147 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5fef27-0741-4f5a-9a12-fa6917cf16af" containerName="nova-cell1-conductor-db-sync" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.874326 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5fef27-0741-4f5a-9a12-fa6917cf16af" containerName="nova-cell1-conductor-db-sync" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.874732 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5fef27-0741-4f5a-9a12-fa6917cf16af" containerName="nova-cell1-conductor-db-sync" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.875409 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.877345 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.889209 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.906315 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mscsw\" (UniqueName: \"kubernetes.io/projected/e49d31ab-3c5e-4389-8cf4-798995b5880f-kube-api-access-mscsw\") pod \"nova-cell1-conductor-0\" (UID: \"e49d31ab-3c5e-4389-8cf4-798995b5880f\") " pod="openstack/nova-cell1-conductor-0" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.906549 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e49d31ab-3c5e-4389-8cf4-798995b5880f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e49d31ab-3c5e-4389-8cf4-798995b5880f\") " pod="openstack/nova-cell1-conductor-0" Feb 19 09:04:48 crc kubenswrapper[4788]: I0219 09:04:48.906579 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e49d31ab-3c5e-4389-8cf4-798995b5880f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e49d31ab-3c5e-4389-8cf4-798995b5880f\") " pod="openstack/nova-cell1-conductor-0" Feb 19 09:04:49 crc kubenswrapper[4788]: I0219 09:04:49.008380 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d31ab-3c5e-4389-8cf4-798995b5880f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e49d31ab-3c5e-4389-8cf4-798995b5880f\") " pod="openstack/nova-cell1-conductor-0" Feb 19 09:04:49 crc kubenswrapper[4788]: I0219 09:04:49.008443 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e49d31ab-3c5e-4389-8cf4-798995b5880f-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e49d31ab-3c5e-4389-8cf4-798995b5880f\") " pod="openstack/nova-cell1-conductor-0" Feb 19 09:04:49 crc kubenswrapper[4788]: I0219 09:04:49.008512 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mscsw\" (UniqueName: \"kubernetes.io/projected/e49d31ab-3c5e-4389-8cf4-798995b5880f-kube-api-access-mscsw\") pod \"nova-cell1-conductor-0\" (UID: \"e49d31ab-3c5e-4389-8cf4-798995b5880f\") " pod="openstack/nova-cell1-conductor-0" Feb 19 09:04:49 crc kubenswrapper[4788]: I0219 09:04:49.013841 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e49d31ab-3c5e-4389-8cf4-798995b5880f-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"e49d31ab-3c5e-4389-8cf4-798995b5880f\") " pod="openstack/nova-cell1-conductor-0" Feb 19 09:04:49 crc kubenswrapper[4788]: I0219 09:04:49.014674 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d31ab-3c5e-4389-8cf4-798995b5880f-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e49d31ab-3c5e-4389-8cf4-798995b5880f\") " pod="openstack/nova-cell1-conductor-0" Feb 19 09:04:49 crc kubenswrapper[4788]: I0219 09:04:49.030652 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mscsw\" (UniqueName: \"kubernetes.io/projected/e49d31ab-3c5e-4389-8cf4-798995b5880f-kube-api-access-mscsw\") pod \"nova-cell1-conductor-0\" (UID: \"e49d31ab-3c5e-4389-8cf4-798995b5880f\") " pod="openstack/nova-cell1-conductor-0" Feb 19 09:04:49 crc kubenswrapper[4788]: I0219 09:04:49.205446 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 19 09:04:49 crc kubenswrapper[4788]: I0219 09:04:49.653096 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 19 09:04:49 crc kubenswrapper[4788]: I0219 09:04:49.808219 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e49d31ab-3c5e-4389-8cf4-798995b5880f","Type":"ContainerStarted","Data":"9c47f1d9cfa0064d868587add328ae0027d3c30faa7e191fec01c76466b60baf"} Feb 19 09:04:50 crc kubenswrapper[4788]: I0219 09:04:50.817726 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e49d31ab-3c5e-4389-8cf4-798995b5880f","Type":"ContainerStarted","Data":"9a0d47b83af1840cfc0d90a4a8decabd55e9ba705b9e6315f3637177b59de06e"} Feb 19 09:04:50 crc kubenswrapper[4788]: I0219 09:04:50.818015 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" 
Feb 19 09:04:50 crc kubenswrapper[4788]: I0219 09:04:50.842716 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.842686779 podStartE2EDuration="2.842686779s" podCreationTimestamp="2026-02-19 09:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:04:50.835403369 +0000 UTC m=+1192.823414851" watchObservedRunningTime="2026-02-19 09:04:50.842686779 +0000 UTC m=+1192.830698281" Feb 19 09:04:51 crc kubenswrapper[4788]: I0219 09:04:51.098130 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 09:04:51 crc kubenswrapper[4788]: I0219 09:04:51.098194 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 19 09:04:52 crc kubenswrapper[4788]: I0219 09:04:52.038576 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 19 09:04:52 crc kubenswrapper[4788]: I0219 09:04:52.081001 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 19 09:04:52 crc kubenswrapper[4788]: I0219 09:04:52.108491 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 19 09:04:52 crc kubenswrapper[4788]: I0219 09:04:52.108537 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" 
Feb 19 09:04:52 crc kubenswrapper[4788]: I0219 09:04:52.878566 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 19 09:04:56 crc kubenswrapper[4788]: I0219 09:04:56.135077 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 09:04:56 crc kubenswrapper[4788]: I0219 09:04:56.135744 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 19 09:04:57 crc kubenswrapper[4788]: I0219 09:04:57.217726 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 09:04:57 crc kubenswrapper[4788]: I0219 09:04:57.217701 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 19 09:04:59 crc kubenswrapper[4788]: I0219 09:04:58.999868 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 19 09:04:59 crc kubenswrapper[4788]: I0219 09:04:59.232964 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 19 09:05:01 crc kubenswrapper[4788]: I0219 09:05:01.104379 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 09:05:01 crc kubenswrapper[4788]: I0219 09:05:01.104751 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 19 09:05:01 crc kubenswrapper[4788]: I0219 09:05:01.110072 4788 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 09:05:01 crc kubenswrapper[4788]: I0219 09:05:01.110131 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 19 09:05:02 crc kubenswrapper[4788]: I0219 09:05:02.387297 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 09:05:02 crc kubenswrapper[4788]: I0219 09:05:02.387647 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ff06e9e7-8b7d-42d9-b321-172ede793104" containerName="kube-state-metrics" containerID="cri-o://553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289" gracePeriod=30 Feb 19 09:05:02 crc kubenswrapper[4788]: I0219 09:05:02.871163 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 09:05:02 crc kubenswrapper[4788]: I0219 09:05:02.926057 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbtcw\" (UniqueName: \"kubernetes.io/projected/ff06e9e7-8b7d-42d9-b321-172ede793104-kube-api-access-gbtcw\") pod \"ff06e9e7-8b7d-42d9-b321-172ede793104\" (UID: \"ff06e9e7-8b7d-42d9-b321-172ede793104\") " Feb 19 09:05:02 crc kubenswrapper[4788]: I0219 09:05:02.938340 4788 generic.go:334] "Generic (PLEG): container finished" podID="ff06e9e7-8b7d-42d9-b321-172ede793104" containerID="553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289" exitCode=2 Feb 19 09:05:02 crc kubenswrapper[4788]: I0219 09:05:02.938396 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ff06e9e7-8b7d-42d9-b321-172ede793104","Type":"ContainerDied","Data":"553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289"} Feb 19 09:05:02 crc kubenswrapper[4788]: I0219 09:05:02.938427 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"ff06e9e7-8b7d-42d9-b321-172ede793104","Type":"ContainerDied","Data":"82976de3db0829358404db5db709f36d828027a92aab9593161896223fd3a387"} Feb 19 09:05:02 crc kubenswrapper[4788]: I0219 09:05:02.938447 4788 scope.go:117] "RemoveContainer" containerID="553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289" Feb 19 09:05:02 crc kubenswrapper[4788]: I0219 09:05:02.938453 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 09:05:02 crc kubenswrapper[4788]: I0219 09:05:02.939994 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff06e9e7-8b7d-42d9-b321-172ede793104-kube-api-access-gbtcw" (OuterVolumeSpecName: "kube-api-access-gbtcw") pod "ff06e9e7-8b7d-42d9-b321-172ede793104" (UID: "ff06e9e7-8b7d-42d9-b321-172ede793104"). InnerVolumeSpecName "kube-api-access-gbtcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.007580 4788 scope.go:117] "RemoveContainer" containerID="553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289" Feb 19 09:05:03 crc kubenswrapper[4788]: E0219 09:05:03.009456 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289\": container with ID starting with 553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289 not found: ID does not exist" containerID="553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.009526 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289"} err="failed to get container status \"553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289\": rpc 
error: code = NotFound desc = could not find container \"553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289\": container with ID starting with 553ea7b1a4e57481939711607b0914267d8ae04153b8dcbf26d7c74f69e33289 not found: ID does not exist" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.028846 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbtcw\" (UniqueName: \"kubernetes.io/projected/ff06e9e7-8b7d-42d9-b321-172ede793104-kube-api-access-gbtcw\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.271494 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.282659 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.300112 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 09:05:03 crc kubenswrapper[4788]: E0219 09:05:03.300507 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff06e9e7-8b7d-42d9-b321-172ede793104" containerName="kube-state-metrics" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.300525 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff06e9e7-8b7d-42d9-b321-172ede793104" containerName="kube-state-metrics" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.300705 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff06e9e7-8b7d-42d9-b321-172ede793104" containerName="kube-state-metrics" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.301303 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.303755 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.304144 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.321780 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.336257 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10e25de6-536d-4640-b29a-702c7d4ca706-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.336318 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e25de6-536d-4640-b29a-702c7d4ca706-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.336733 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e25de6-536d-4640-b29a-702c7d4ca706-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.336838 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pg7m\" (UniqueName: 
\"kubernetes.io/projected/10e25de6-536d-4640-b29a-702c7d4ca706-kube-api-access-2pg7m\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.440007 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pg7m\" (UniqueName: \"kubernetes.io/projected/10e25de6-536d-4640-b29a-702c7d4ca706-kube-api-access-2pg7m\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.440160 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10e25de6-536d-4640-b29a-702c7d4ca706-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.440200 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e25de6-536d-4640-b29a-702c7d4ca706-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.440372 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e25de6-536d-4640-b29a-702c7d4ca706-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.445925 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/10e25de6-536d-4640-b29a-702c7d4ca706-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.448005 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10e25de6-536d-4640-b29a-702c7d4ca706-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.457199 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10e25de6-536d-4640-b29a-702c7d4ca706-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.458014 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pg7m\" (UniqueName: \"kubernetes.io/projected/10e25de6-536d-4640-b29a-702c7d4ca706-kube-api-access-2pg7m\") pod \"kube-state-metrics-0\" (UID: \"10e25de6-536d-4640-b29a-702c7d4ca706\") " pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.618603 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.870170 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.948561 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-config-data\") pod \"f82971ff-0ab5-4ae4-8de7-73159394c022\" (UID: \"f82971ff-0ab5-4ae4-8de7-73159394c022\") " Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.948632 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-combined-ca-bundle\") pod \"f82971ff-0ab5-4ae4-8de7-73159394c022\" (UID: \"f82971ff-0ab5-4ae4-8de7-73159394c022\") " Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.948697 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll6d4\" (UniqueName: \"kubernetes.io/projected/f82971ff-0ab5-4ae4-8de7-73159394c022-kube-api-access-ll6d4\") pod \"f82971ff-0ab5-4ae4-8de7-73159394c022\" (UID: \"f82971ff-0ab5-4ae4-8de7-73159394c022\") " Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.950211 4788 generic.go:334] "Generic (PLEG): container finished" podID="f82971ff-0ab5-4ae4-8de7-73159394c022" containerID="6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae" exitCode=137 Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.950285 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.950303 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f82971ff-0ab5-4ae4-8de7-73159394c022","Type":"ContainerDied","Data":"6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae"} Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.950341 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f82971ff-0ab5-4ae4-8de7-73159394c022","Type":"ContainerDied","Data":"116ed3607d3142147804194b62001eef08b769a93895b202faa0d892ec70bbc2"} Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.950360 4788 scope.go:117] "RemoveContainer" containerID="6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.953844 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f82971ff-0ab5-4ae4-8de7-73159394c022-kube-api-access-ll6d4" (OuterVolumeSpecName: "kube-api-access-ll6d4") pod "f82971ff-0ab5-4ae4-8de7-73159394c022" (UID: "f82971ff-0ab5-4ae4-8de7-73159394c022"). InnerVolumeSpecName "kube-api-access-ll6d4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.972172 4788 scope.go:117] "RemoveContainer" containerID="6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae" Feb 19 09:05:03 crc kubenswrapper[4788]: E0219 09:05:03.972801 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae\": container with ID starting with 6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae not found: ID does not exist" containerID="6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.972841 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae"} err="failed to get container status \"6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae\": rpc error: code = NotFound desc = could not find container \"6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae\": container with ID starting with 6327ca3dad413227da0639e948a987b699e96f30ec0be1b4a71dcdee6251acae not found: ID does not exist" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.980125 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f82971ff-0ab5-4ae4-8de7-73159394c022" (UID: "f82971ff-0ab5-4ae4-8de7-73159394c022"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:03 crc kubenswrapper[4788]: I0219 09:05:03.986314 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-config-data" (OuterVolumeSpecName: "config-data") pod "f82971ff-0ab5-4ae4-8de7-73159394c022" (UID: "f82971ff-0ab5-4ae4-8de7-73159394c022"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.051319 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.051365 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82971ff-0ab5-4ae4-8de7-73159394c022-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.051384 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll6d4\" (UniqueName: \"kubernetes.io/projected/f82971ff-0ab5-4ae4-8de7-73159394c022-kube-api-access-ll6d4\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.109515 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.281323 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.291038 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.301892 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.302425 4788 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="ceilometer-central-agent" containerID="cri-o://c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850" gracePeriod=30 Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.302449 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="proxy-httpd" containerID="cri-o://6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95" gracePeriod=30 Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.302511 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="sg-core" containerID="cri-o://aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c" gracePeriod=30 Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.302530 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="ceilometer-notification-agent" containerID="cri-o://53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0" gracePeriod=30 Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.315429 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 09:05:04 crc kubenswrapper[4788]: E0219 09:05:04.316032 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82971ff-0ab5-4ae4-8de7-73159394c022" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.316102 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82971ff-0ab5-4ae4-8de7-73159394c022" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.317484 4788 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f82971ff-0ab5-4ae4-8de7-73159394c022" containerName="nova-cell1-novncproxy-novncproxy" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.318202 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.321044 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.321398 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.321547 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.329458 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.356497 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.356796 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.356914 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.357094 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.357178 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwwzn\" (UniqueName: \"kubernetes.io/projected/f812a94e-1c79-42f5-8caa-34cd8352999c-kube-api-access-fwwzn\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.458742 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.458807 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.458884 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.458948 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.458966 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwwzn\" (UniqueName: \"kubernetes.io/projected/f812a94e-1c79-42f5-8caa-34cd8352999c-kube-api-access-fwwzn\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.463929 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.464057 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.465067 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.470824 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f812a94e-1c79-42f5-8caa-34cd8352999c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.481448 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwwzn\" (UniqueName: \"kubernetes.io/projected/f812a94e-1c79-42f5-8caa-34cd8352999c-kube-api-access-fwwzn\") pod \"nova-cell1-novncproxy-0\" (UID: \"f812a94e-1c79-42f5-8caa-34cd8352999c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.638596 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.739335 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f82971ff-0ab5-4ae4-8de7-73159394c022" path="/var/lib/kubelet/pods/f82971ff-0ab5-4ae4-8de7-73159394c022/volumes" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.740171 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff06e9e7-8b7d-42d9-b321-172ede793104" path="/var/lib/kubelet/pods/ff06e9e7-8b7d-42d9-b321-172ede793104/volumes" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.985907 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"10e25de6-536d-4640-b29a-702c7d4ca706","Type":"ContainerStarted","Data":"364b53d84213924b0c20ad614d606dcb8270ad9f3e98ea5bb7da2d77d45a9107"} Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.986212 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"10e25de6-536d-4640-b29a-702c7d4ca706","Type":"ContainerStarted","Data":"9362ea41f4fc68862ccd9ab37133e73abdb983bbe429ec0c0ccc0811c09f24e4"} Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.986279 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.995435 4788 generic.go:334] "Generic (PLEG): container finished" podID="423bc761-9d6c-4518-9668-30dc36cdd536" containerID="6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95" exitCode=0 Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.995466 4788 generic.go:334] "Generic (PLEG): container finished" podID="423bc761-9d6c-4518-9668-30dc36cdd536" containerID="aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c" exitCode=2 Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.995473 4788 generic.go:334] "Generic (PLEG): container finished" podID="423bc761-9d6c-4518-9668-30dc36cdd536" containerID="c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850" exitCode=0 Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.995595 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"423bc761-9d6c-4518-9668-30dc36cdd536","Type":"ContainerDied","Data":"6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95"} Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.995681 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"423bc761-9d6c-4518-9668-30dc36cdd536","Type":"ContainerDied","Data":"aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c"} Feb 19 09:05:04 crc kubenswrapper[4788]: I0219 09:05:04.995698 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"423bc761-9d6c-4518-9668-30dc36cdd536","Type":"ContainerDied","Data":"c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850"} Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.013471 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.507522928 podStartE2EDuration="2.013455697s" podCreationTimestamp="2026-02-19 09:05:03 +0000 UTC" firstStartedPulling="2026-02-19 09:05:04.114234664 +0000 UTC m=+1206.102246136" lastFinishedPulling="2026-02-19 09:05:04.620167413 +0000 UTC m=+1206.608178905" observedRunningTime="2026-02-19 09:05:05.003283711 +0000 UTC m=+1206.991295173" watchObservedRunningTime="2026-02-19 09:05:05.013455697 +0000 UTC m=+1207.001467169" Feb 19 09:05:05 crc kubenswrapper[4788]: W0219 09:05:05.095462 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf812a94e_1c79_42f5_8caa_34cd8352999c.slice/crio-827e5aacd998406a9dd4e97d0c42022bb671b410b4715a77fd7c4e4fb3209034 WatchSource:0}: Error finding container 827e5aacd998406a9dd4e97d0c42022bb671b410b4715a77fd7c4e4fb3209034: Status 404 returned error can't find the container with id 827e5aacd998406a9dd4e97d0c42022bb671b410b4715a77fd7c4e4fb3209034 Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.096007 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.712534 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.790541 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-log-httpd\") pod \"423bc761-9d6c-4518-9668-30dc36cdd536\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.790676 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-sg-core-conf-yaml\") pod \"423bc761-9d6c-4518-9668-30dc36cdd536\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.790802 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj62h\" (UniqueName: \"kubernetes.io/projected/423bc761-9d6c-4518-9668-30dc36cdd536-kube-api-access-kj62h\") pod \"423bc761-9d6c-4518-9668-30dc36cdd536\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.790844 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-scripts\") pod \"423bc761-9d6c-4518-9668-30dc36cdd536\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.790882 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-run-httpd\") pod \"423bc761-9d6c-4518-9668-30dc36cdd536\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.790910 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-combined-ca-bundle\") pod \"423bc761-9d6c-4518-9668-30dc36cdd536\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.790958 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-config-data\") pod \"423bc761-9d6c-4518-9668-30dc36cdd536\" (UID: \"423bc761-9d6c-4518-9668-30dc36cdd536\") " Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.791266 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "423bc761-9d6c-4518-9668-30dc36cdd536" (UID: "423bc761-9d6c-4518-9668-30dc36cdd536"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.791586 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "423bc761-9d6c-4518-9668-30dc36cdd536" (UID: "423bc761-9d6c-4518-9668-30dc36cdd536"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.791653 4788 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.795742 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/423bc761-9d6c-4518-9668-30dc36cdd536-kube-api-access-kj62h" (OuterVolumeSpecName: "kube-api-access-kj62h") pod "423bc761-9d6c-4518-9668-30dc36cdd536" (UID: "423bc761-9d6c-4518-9668-30dc36cdd536"). 
InnerVolumeSpecName "kube-api-access-kj62h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.800489 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-scripts" (OuterVolumeSpecName: "scripts") pod "423bc761-9d6c-4518-9668-30dc36cdd536" (UID: "423bc761-9d6c-4518-9668-30dc36cdd536"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.823225 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "423bc761-9d6c-4518-9668-30dc36cdd536" (UID: "423bc761-9d6c-4518-9668-30dc36cdd536"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.878626 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "423bc761-9d6c-4518-9668-30dc36cdd536" (UID: "423bc761-9d6c-4518-9668-30dc36cdd536"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.893863 4788 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.893917 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj62h\" (UniqueName: \"kubernetes.io/projected/423bc761-9d6c-4518-9668-30dc36cdd536-kube-api-access-kj62h\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.893933 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.893943 4788 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/423bc761-9d6c-4518-9668-30dc36cdd536-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.893953 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.916724 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-config-data" (OuterVolumeSpecName: "config-data") pod "423bc761-9d6c-4518-9668-30dc36cdd536" (UID: "423bc761-9d6c-4518-9668-30dc36cdd536"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:05 crc kubenswrapper[4788]: I0219 09:05:05.995948 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/423bc761-9d6c-4518-9668-30dc36cdd536-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.007339 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f812a94e-1c79-42f5-8caa-34cd8352999c","Type":"ContainerStarted","Data":"31e5deb741a6fcf3b716587e1f8a2298e3473f97c85fb418b1933c4d257bb0e4"} Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.007398 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f812a94e-1c79-42f5-8caa-34cd8352999c","Type":"ContainerStarted","Data":"827e5aacd998406a9dd4e97d0c42022bb671b410b4715a77fd7c4e4fb3209034"} Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.012041 4788 generic.go:334] "Generic (PLEG): container finished" podID="423bc761-9d6c-4518-9668-30dc36cdd536" containerID="53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0" exitCode=0 Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.012154 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.012732 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"423bc761-9d6c-4518-9668-30dc36cdd536","Type":"ContainerDied","Data":"53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0"} Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.012772 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"423bc761-9d6c-4518-9668-30dc36cdd536","Type":"ContainerDied","Data":"effe47ae70c8e297510610953035ec6390f8b9c5d51174f6f6b7f547d12dbda9"} Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.012792 4788 scope.go:117] "RemoveContainer" containerID="6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.028771 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.028749266 podStartE2EDuration="2.028749266s" podCreationTimestamp="2026-02-19 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:05:06.023903749 +0000 UTC m=+1208.011915241" watchObservedRunningTime="2026-02-19 09:05:06.028749266 +0000 UTC m=+1208.016760748" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.048663 4788 scope.go:117] "RemoveContainer" containerID="aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.074870 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.083805 4788 scope.go:117] "RemoveContainer" containerID="53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.096792 4788 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.109575 4788 scope.go:117] "RemoveContainer" containerID="c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.117140 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:06 crc kubenswrapper[4788]: E0219 09:05:06.117698 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="ceilometer-central-agent" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.117725 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="ceilometer-central-agent" Feb 19 09:05:06 crc kubenswrapper[4788]: E0219 09:05:06.117750 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="ceilometer-notification-agent" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.117758 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="ceilometer-notification-agent" Feb 19 09:05:06 crc kubenswrapper[4788]: E0219 09:05:06.117809 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="sg-core" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.117818 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="sg-core" Feb 19 09:05:06 crc kubenswrapper[4788]: E0219 09:05:06.117840 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="proxy-httpd" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.117847 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="proxy-httpd" Feb 19 09:05:06 crc 
kubenswrapper[4788]: I0219 09:05:06.118069 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="ceilometer-central-agent" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.118086 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="proxy-httpd" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.118105 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="ceilometer-notification-agent" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.118125 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" containerName="sg-core" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.120512 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.124389 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.125804 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.126436 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.128757 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.139977 4788 scope.go:117] "RemoveContainer" containerID="6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95" Feb 19 09:05:06 crc kubenswrapper[4788]: E0219 09:05:06.142822 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95\": container with ID starting with 6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95 not found: ID does not exist" containerID="6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.142872 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95"} err="failed to get container status \"6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95\": rpc error: code = NotFound desc = could not find container \"6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95\": container with ID starting with 6ec134eb19bb0a160910c4baceae7a9092e992f66841c2c3b7b5c05d36a00b95 not found: ID does not exist" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.142898 4788 scope.go:117] "RemoveContainer" containerID="aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.143072 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 09:05:06 crc kubenswrapper[4788]: E0219 09:05:06.143365 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c\": container with ID starting with aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c not found: ID does not exist" containerID="aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.143402 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c"} err="failed to get container status 
\"aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c\": rpc error: code = NotFound desc = could not find container \"aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c\": container with ID starting with aa4018bf80a439f6302ee325509da6f5cc0c7ccc33d15fd222bbac70b859a64c not found: ID does not exist" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.143424 4788 scope.go:117] "RemoveContainer" containerID="53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.143498 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.144403 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 19 09:05:06 crc kubenswrapper[4788]: E0219 09:05:06.144122 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0\": container with ID starting with 53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0 not found: ID does not exist" containerID="53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.144694 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0"} err="failed to get container status \"53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0\": rpc error: code = NotFound desc = could not find container \"53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0\": container with ID starting with 53e3bac873af514ea1f70e6a30b1a67a0a7b839dcacaf426e448e231a357b6f0 not found: ID does not exist" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.144781 4788 scope.go:117] "RemoveContainer" 
containerID="c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.146541 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 09:05:06 crc kubenswrapper[4788]: E0219 09:05:06.146914 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850\": container with ID starting with c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850 not found: ID does not exist" containerID="c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.146943 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850"} err="failed to get container status \"c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850\": rpc error: code = NotFound desc = could not find container \"c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850\": container with ID starting with c394e16c3103cd8701cae00c6a6a34941646daf960df669c54990584df5a9850 not found: ID does not exist" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.199267 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.199616 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-scripts\") pod \"ceilometer-0\" (UID: 
\"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.200063 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk7jg\" (UniqueName: \"kubernetes.io/projected/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-kube-api-access-zk7jg\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.200145 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-run-httpd\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.200230 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-log-httpd\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.200541 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.200591 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: 
I0219 09:05:06.200646 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-config-data\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.302471 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.302522 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-scripts\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.302600 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk7jg\" (UniqueName: \"kubernetes.io/projected/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-kube-api-access-zk7jg\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.302629 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-run-httpd\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.302653 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-log-httpd\") pod 
\"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.302691 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.302710 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.302739 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-config-data\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.303882 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-log-httpd\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.303922 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-run-httpd\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.307059 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-scripts\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.307312 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.307554 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-config-data\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.308298 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.309834 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.323298 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk7jg\" (UniqueName: \"kubernetes.io/projected/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-kube-api-access-zk7jg\") pod \"ceilometer-0\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 
09:05:06.445961 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.726589 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="423bc761-9d6c-4518-9668-30dc36cdd536" path="/var/lib/kubelet/pods/423bc761-9d6c-4518-9668-30dc36cdd536/volumes" Feb 19 09:05:06 crc kubenswrapper[4788]: I0219 09:05:06.967383 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:06 crc kubenswrapper[4788]: W0219 09:05:06.974948 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod342f5b09_d9bb_4b4b_8777_30dd3a8253ca.slice/crio-46074013e4e1449f671bc650328487f7a6c00650d2f38c301992540926225c2e WatchSource:0}: Error finding container 46074013e4e1449f671bc650328487f7a6c00650d2f38c301992540926225c2e: Status 404 returned error can't find the container with id 46074013e4e1449f671bc650328487f7a6c00650d2f38c301992540926225c2e Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.026120 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"342f5b09-d9bb-4b4b-8777-30dd3a8253ca","Type":"ContainerStarted","Data":"46074013e4e1449f671bc650328487f7a6c00650d2f38c301992540926225c2e"} Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.026477 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.035266 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.225741 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4"] Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.227298 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.257173 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4"] Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.321321 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.321396 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lx8f\" (UniqueName: \"kubernetes.io/projected/bb86749e-3ca7-473a-88ee-26e930c57552-kube-api-access-8lx8f\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.321473 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.321528 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.321580 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-config\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.321605 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.422822 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.423181 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lx8f\" (UniqueName: \"kubernetes.io/projected/bb86749e-3ca7-473a-88ee-26e930c57552-kube-api-access-8lx8f\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.423237 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.423285 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.423347 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-config\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.423378 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.423840 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.424043 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.424469 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.424722 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.425353 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-config\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.440083 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lx8f\" (UniqueName: \"kubernetes.io/projected/bb86749e-3ca7-473a-88ee-26e930c57552-kube-api-access-8lx8f\") pod \"dnsmasq-dns-6b7bbf7cf9-qhlb4\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:07 crc kubenswrapper[4788]: I0219 09:05:07.570764 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:08 crc kubenswrapper[4788]: I0219 09:05:08.037304 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"342f5b09-d9bb-4b4b-8777-30dd3a8253ca","Type":"ContainerStarted","Data":"da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92"} Feb 19 09:05:08 crc kubenswrapper[4788]: W0219 09:05:08.119458 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb86749e_3ca7_473a_88ee_26e930c57552.slice/crio-9b16c3ff2d226d58be4c39a6c9946d9e6ba26a34c1606b24b590146b41a2aa93 WatchSource:0}: Error finding container 9b16c3ff2d226d58be4c39a6c9946d9e6ba26a34c1606b24b590146b41a2aa93: Status 404 returned error can't find the container with id 9b16c3ff2d226d58be4c39a6c9946d9e6ba26a34c1606b24b590146b41a2aa93 Feb 19 09:05:08 crc kubenswrapper[4788]: I0219 09:05:08.129079 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4"] Feb 19 09:05:09 crc kubenswrapper[4788]: I0219 09:05:09.048182 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"342f5b09-d9bb-4b4b-8777-30dd3a8253ca","Type":"ContainerStarted","Data":"753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d"} Feb 19 09:05:09 crc kubenswrapper[4788]: I0219 09:05:09.051216 4788 generic.go:334] "Generic (PLEG): container finished" podID="bb86749e-3ca7-473a-88ee-26e930c57552" containerID="670f3e1927fd871c0a41b0c44b9c32311e4c4e9f6f10b7eaaba810f99a2b7d12" exitCode=0 Feb 19 09:05:09 crc kubenswrapper[4788]: I0219 09:05:09.051280 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" event={"ID":"bb86749e-3ca7-473a-88ee-26e930c57552","Type":"ContainerDied","Data":"670f3e1927fd871c0a41b0c44b9c32311e4c4e9f6f10b7eaaba810f99a2b7d12"} Feb 19 09:05:09 crc kubenswrapper[4788]: I0219 
09:05:09.051332 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" event={"ID":"bb86749e-3ca7-473a-88ee-26e930c57552","Type":"ContainerStarted","Data":"9b16c3ff2d226d58be4c39a6c9946d9e6ba26a34c1606b24b590146b41a2aa93"} Feb 19 09:05:09 crc kubenswrapper[4788]: I0219 09:05:09.236744 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:05:09 crc kubenswrapper[4788]: I0219 09:05:09.639915 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:09 crc kubenswrapper[4788]: I0219 09:05:09.761372 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 09:05:10 crc kubenswrapper[4788]: I0219 09:05:10.072456 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"342f5b09-d9bb-4b4b-8777-30dd3a8253ca","Type":"ContainerStarted","Data":"0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728"} Feb 19 09:05:10 crc kubenswrapper[4788]: I0219 09:05:10.081909 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerName="nova-api-log" containerID="cri-o://095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c" gracePeriod=30 Feb 19 09:05:10 crc kubenswrapper[4788]: I0219 09:05:10.082595 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" event={"ID":"bb86749e-3ca7-473a-88ee-26e930c57552","Type":"ContainerStarted","Data":"a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94"} Feb 19 09:05:10 crc kubenswrapper[4788]: I0219 09:05:10.083364 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:10 crc kubenswrapper[4788]: I0219 09:05:10.083472 4788 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/nova-api-0" podUID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerName="nova-api-api" containerID="cri-o://119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186" gracePeriod=30 Feb 19 09:05:10 crc kubenswrapper[4788]: I0219 09:05:10.122500 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" podStartSLOduration=3.1224769549999998 podStartE2EDuration="3.122476955s" podCreationTimestamp="2026-02-19 09:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:05:10.113781815 +0000 UTC m=+1212.101793287" watchObservedRunningTime="2026-02-19 09:05:10.122476955 +0000 UTC m=+1212.110488427" Feb 19 09:05:10 crc kubenswrapper[4788]: I0219 09:05:10.267528 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:11 crc kubenswrapper[4788]: I0219 09:05:11.105611 4788 generic.go:334] "Generic (PLEG): container finished" podID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerID="095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c" exitCode=143 Feb 19 09:05:11 crc kubenswrapper[4788]: I0219 09:05:11.105954 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2ead14dd-01c7-49b3-9509-413bf05d77e6","Type":"ContainerDied","Data":"095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c"} Feb 19 09:05:12 crc kubenswrapper[4788]: I0219 09:05:12.121067 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"342f5b09-d9bb-4b4b-8777-30dd3a8253ca","Type":"ContainerStarted","Data":"7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e"} Feb 19 09:05:12 crc kubenswrapper[4788]: I0219 09:05:12.121612 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="ceilometer-notification-agent" containerID="cri-o://753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d" gracePeriod=30 Feb 19 09:05:12 crc kubenswrapper[4788]: I0219 09:05:12.121619 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="sg-core" containerID="cri-o://0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728" gracePeriod=30 Feb 19 09:05:12 crc kubenswrapper[4788]: I0219 09:05:12.121639 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 19 09:05:12 crc kubenswrapper[4788]: I0219 09:05:12.121203 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="ceilometer-central-agent" containerID="cri-o://da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92" gracePeriod=30 Feb 19 09:05:12 crc kubenswrapper[4788]: I0219 09:05:12.121520 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="proxy-httpd" containerID="cri-o://7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e" gracePeriod=30 Feb 19 09:05:12 crc kubenswrapper[4788]: I0219 09:05:12.167119 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.240970252 podStartE2EDuration="6.167088756s" podCreationTimestamp="2026-02-19 09:05:06 +0000 UTC" firstStartedPulling="2026-02-19 09:05:06.978462591 +0000 UTC m=+1208.966474073" lastFinishedPulling="2026-02-19 09:05:10.904581105 +0000 UTC m=+1212.892592577" observedRunningTime="2026-02-19 09:05:12.15979824 +0000 UTC m=+1214.147809712" watchObservedRunningTime="2026-02-19 09:05:12.167088756 +0000 UTC 
m=+1214.155100218" Feb 19 09:05:13 crc kubenswrapper[4788]: I0219 09:05:13.132053 4788 generic.go:334] "Generic (PLEG): container finished" podID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerID="7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e" exitCode=0 Feb 19 09:05:13 crc kubenswrapper[4788]: I0219 09:05:13.132521 4788 generic.go:334] "Generic (PLEG): container finished" podID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerID="0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728" exitCode=2 Feb 19 09:05:13 crc kubenswrapper[4788]: I0219 09:05:13.132100 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"342f5b09-d9bb-4b4b-8777-30dd3a8253ca","Type":"ContainerDied","Data":"7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e"} Feb 19 09:05:13 crc kubenswrapper[4788]: I0219 09:05:13.132616 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"342f5b09-d9bb-4b4b-8777-30dd3a8253ca","Type":"ContainerDied","Data":"0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728"} Feb 19 09:05:13 crc kubenswrapper[4788]: I0219 09:05:13.132649 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"342f5b09-d9bb-4b4b-8777-30dd3a8253ca","Type":"ContainerDied","Data":"753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d"} Feb 19 09:05:13 crc kubenswrapper[4788]: I0219 09:05:13.132560 4788 generic.go:334] "Generic (PLEG): container finished" podID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerID="753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d" exitCode=0 Feb 19 09:05:13 crc kubenswrapper[4788]: I0219 09:05:13.631234 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 19 09:05:13 crc kubenswrapper[4788]: I0219 09:05:13.864727 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.045596 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-config-data\") pod \"2ead14dd-01c7-49b3-9509-413bf05d77e6\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.045714 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-combined-ca-bundle\") pod \"2ead14dd-01c7-49b3-9509-413bf05d77e6\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.045753 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d2xh\" (UniqueName: \"kubernetes.io/projected/2ead14dd-01c7-49b3-9509-413bf05d77e6-kube-api-access-6d2xh\") pod \"2ead14dd-01c7-49b3-9509-413bf05d77e6\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.045843 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ead14dd-01c7-49b3-9509-413bf05d77e6-logs\") pod \"2ead14dd-01c7-49b3-9509-413bf05d77e6\" (UID: \"2ead14dd-01c7-49b3-9509-413bf05d77e6\") " Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.046778 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ead14dd-01c7-49b3-9509-413bf05d77e6-logs" (OuterVolumeSpecName: "logs") pod "2ead14dd-01c7-49b3-9509-413bf05d77e6" (UID: "2ead14dd-01c7-49b3-9509-413bf05d77e6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.052508 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ead14dd-01c7-49b3-9509-413bf05d77e6-kube-api-access-6d2xh" (OuterVolumeSpecName: "kube-api-access-6d2xh") pod "2ead14dd-01c7-49b3-9509-413bf05d77e6" (UID: "2ead14dd-01c7-49b3-9509-413bf05d77e6"). InnerVolumeSpecName "kube-api-access-6d2xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.081057 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-config-data" (OuterVolumeSpecName: "config-data") pod "2ead14dd-01c7-49b3-9509-413bf05d77e6" (UID: "2ead14dd-01c7-49b3-9509-413bf05d77e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.091863 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ead14dd-01c7-49b3-9509-413bf05d77e6" (UID: "2ead14dd-01c7-49b3-9509-413bf05d77e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.142840 4788 generic.go:334] "Generic (PLEG): container finished" podID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerID="119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186" exitCode=0 Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.142888 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.142893 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2ead14dd-01c7-49b3-9509-413bf05d77e6","Type":"ContainerDied","Data":"119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186"} Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.143043 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2ead14dd-01c7-49b3-9509-413bf05d77e6","Type":"ContainerDied","Data":"9dbb28099dac1caa109bb90d69042d88d8e55a34e9e8c1ef986a8222a6bc61e0"} Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.143076 4788 scope.go:117] "RemoveContainer" containerID="119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.156208 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.156564 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ead14dd-01c7-49b3-9509-413bf05d77e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.156576 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d2xh\" (UniqueName: \"kubernetes.io/projected/2ead14dd-01c7-49b3-9509-413bf05d77e6-kube-api-access-6d2xh\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.156587 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ead14dd-01c7-49b3-9509-413bf05d77e6-logs\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.178841 4788 scope.go:117] "RemoveContainer" 
containerID="095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.205488 4788 scope.go:117] "RemoveContainer" containerID="119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186" Feb 19 09:05:14 crc kubenswrapper[4788]: E0219 09:05:14.205848 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186\": container with ID starting with 119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186 not found: ID does not exist" containerID="119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.205877 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186"} err="failed to get container status \"119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186\": rpc error: code = NotFound desc = could not find container \"119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186\": container with ID starting with 119c608b88a82ac49cf16f4f2e60433d99831d16db5759a84b66f2d37eb2a186 not found: ID does not exist" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.205898 4788 scope.go:117] "RemoveContainer" containerID="095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.206104 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 19 09:05:14 crc kubenswrapper[4788]: E0219 09:05:14.206165 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c\": container with ID starting with 095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c 
not found: ID does not exist" containerID="095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.206185 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c"} err="failed to get container status \"095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c\": rpc error: code = NotFound desc = could not find container \"095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c\": container with ID starting with 095912e4ee16110ad0077281aa105832d2100889ce99da86a006e2a5cc06fd8c not found: ID does not exist" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.222423 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.238809 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 19 09:05:14 crc kubenswrapper[4788]: E0219 09:05:14.239309 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerName="nova-api-log" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.239326 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerName="nova-api-log" Feb 19 09:05:14 crc kubenswrapper[4788]: E0219 09:05:14.239376 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerName="nova-api-api" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.239383 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerName="nova-api-api" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.239552 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerName="nova-api-api" Feb 19 09:05:14 crc 
kubenswrapper[4788]: I0219 09:05:14.239576 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ead14dd-01c7-49b3-9509-413bf05d77e6" containerName="nova-api-log" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.240603 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.242774 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.243172 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.243315 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.282098 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.370393 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q95n5\" (UniqueName: \"kubernetes.io/projected/fc72f731-1258-4007-8119-30ad6b1837ef-kube-api-access-q95n5\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.370462 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc72f731-1258-4007-8119-30ad6b1837ef-logs\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.370747 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.370885 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.370966 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-public-tls-certs\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.371023 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-config-data\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.472996 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q95n5\" (UniqueName: \"kubernetes.io/projected/fc72f731-1258-4007-8119-30ad6b1837ef-kube-api-access-q95n5\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.473047 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc72f731-1258-4007-8119-30ad6b1837ef-logs\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 
09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.473116 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.473152 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.473198 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-public-tls-certs\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.473236 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-config-data\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.473646 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc72f731-1258-4007-8119-30ad6b1837ef-logs\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.477518 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.479062 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-config-data\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.479734 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.485449 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.490493 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q95n5\" (UniqueName: \"kubernetes.io/projected/fc72f731-1258-4007-8119-30ad6b1837ef-kube-api-access-q95n5\") pod \"nova-api-0\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") " pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.562626 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.654767 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.680469 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.730118 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ead14dd-01c7-49b3-9509-413bf05d77e6" path="/var/lib/kubelet/pods/2ead14dd-01c7-49b3-9509-413bf05d77e6/volumes" Feb 19 09:05:14 crc kubenswrapper[4788]: I0219 09:05:14.932187 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.084289 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-sg-core-conf-yaml\") pod \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.084385 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-combined-ca-bundle\") pod \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.084453 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk7jg\" (UniqueName: \"kubernetes.io/projected/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-kube-api-access-zk7jg\") pod \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.084487 4788 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-log-httpd\") pod \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.084516 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-config-data\") pod \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.084541 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-scripts\") pod \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.084605 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-run-httpd\") pod \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.084633 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-ceilometer-tls-certs\") pod \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\" (UID: \"342f5b09-d9bb-4b4b-8777-30dd3a8253ca\") " Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.085176 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "342f5b09-d9bb-4b4b-8777-30dd3a8253ca" (UID: 
"342f5b09-d9bb-4b4b-8777-30dd3a8253ca"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.085589 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "342f5b09-d9bb-4b4b-8777-30dd3a8253ca" (UID: "342f5b09-d9bb-4b4b-8777-30dd3a8253ca"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.090978 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-scripts" (OuterVolumeSpecName: "scripts") pod "342f5b09-d9bb-4b4b-8777-30dd3a8253ca" (UID: "342f5b09-d9bb-4b4b-8777-30dd3a8253ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.093370 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-kube-api-access-zk7jg" (OuterVolumeSpecName: "kube-api-access-zk7jg") pod "342f5b09-d9bb-4b4b-8777-30dd3a8253ca" (UID: "342f5b09-d9bb-4b4b-8777-30dd3a8253ca"). InnerVolumeSpecName "kube-api-access-zk7jg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.129525 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "342f5b09-d9bb-4b4b-8777-30dd3a8253ca" (UID: "342f5b09-d9bb-4b4b-8777-30dd3a8253ca"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.155635 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.161670 4788 generic.go:334] "Generic (PLEG): container finished" podID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerID="da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92" exitCode=0 Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.161722 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.161751 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"342f5b09-d9bb-4b4b-8777-30dd3a8253ca","Type":"ContainerDied","Data":"da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92"} Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.161778 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"342f5b09-d9bb-4b4b-8777-30dd3a8253ca","Type":"ContainerDied","Data":"46074013e4e1449f671bc650328487f7a6c00650d2f38c301992540926225c2e"} Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.161795 4788 scope.go:117] "RemoveContainer" containerID="7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e" Feb 19 09:05:15 crc kubenswrapper[4788]: W0219 09:05:15.169134 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc72f731_1258_4007_8119_30ad6b1837ef.slice/crio-a6b0d440baad02e7ced2cc2b031ad7d37ff691fd0b09c8e341217c79109b6f1c WatchSource:0}: Error finding container a6b0d440baad02e7ced2cc2b031ad7d37ff691fd0b09c8e341217c79109b6f1c: Status 404 returned error can't find the container with id a6b0d440baad02e7ced2cc2b031ad7d37ff691fd0b09c8e341217c79109b6f1c Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 
09:05:15.171138 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "342f5b09-d9bb-4b4b-8777-30dd3a8253ca" (UID: "342f5b09-d9bb-4b4b-8777-30dd3a8253ca"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.177521 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "342f5b09-d9bb-4b4b-8777-30dd3a8253ca" (UID: "342f5b09-d9bb-4b4b-8777-30dd3a8253ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.184116 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.185440 4788 scope.go:117] "RemoveContainer" containerID="0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.187516 4788 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.187545 4788 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.187555 4788 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-sg-core-conf-yaml\") on node \"crc\" 
DevicePath \"\"" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.187564 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.187573 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk7jg\" (UniqueName: \"kubernetes.io/projected/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-kube-api-access-zk7jg\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.187582 4788 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.187590 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-scripts\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.222618 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-config-data" (OuterVolumeSpecName: "config-data") pod "342f5b09-d9bb-4b4b-8777-30dd3a8253ca" (UID: "342f5b09-d9bb-4b4b-8777-30dd3a8253ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.231669 4788 scope.go:117] "RemoveContainer" containerID="753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.264572 4788 scope.go:117] "RemoveContainer" containerID="da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.288837 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/342f5b09-d9bb-4b4b-8777-30dd3a8253ca-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.289901 4788 scope.go:117] "RemoveContainer" containerID="7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e" Feb 19 09:05:15 crc kubenswrapper[4788]: E0219 09:05:15.290475 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e\": container with ID starting with 7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e not found: ID does not exist" containerID="7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.290511 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e"} err="failed to get container status \"7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e\": rpc error: code = NotFound desc = could not find container \"7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e\": container with ID starting with 7c56d554fc5cfd288cd2e14e55bb3fa7b709c3bbec4825bf8011b6ca6acf158e not found: ID does not exist" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.290534 4788 scope.go:117] 
"RemoveContainer" containerID="0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728" Feb 19 09:05:15 crc kubenswrapper[4788]: E0219 09:05:15.290824 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728\": container with ID starting with 0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728 not found: ID does not exist" containerID="0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.290884 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728"} err="failed to get container status \"0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728\": rpc error: code = NotFound desc = could not find container \"0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728\": container with ID starting with 0df2bcc667e58d83353ad3644700803b996e395c1bcc83ad39f79c230b7e4728 not found: ID does not exist" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.290899 4788 scope.go:117] "RemoveContainer" containerID="753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d" Feb 19 09:05:15 crc kubenswrapper[4788]: E0219 09:05:15.291194 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d\": container with ID starting with 753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d not found: ID does not exist" containerID="753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.291230 4788 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d"} err="failed to get container status \"753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d\": rpc error: code = NotFound desc = could not find container \"753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d\": container with ID starting with 753505c37bf75419a47db17ab492810061bb76830d4211d5ddf431d9b468ae0d not found: ID does not exist" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.291274 4788 scope.go:117] "RemoveContainer" containerID="da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92" Feb 19 09:05:15 crc kubenswrapper[4788]: E0219 09:05:15.291520 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92\": container with ID starting with da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92 not found: ID does not exist" containerID="da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.291543 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92"} err="failed to get container status \"da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92\": rpc error: code = NotFound desc = could not find container \"da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92\": container with ID starting with da49ce14e11b2720ce62e6771e3f9cc105b088b5eac3143bb54109c89de4fb92 not found: ID does not exist" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.365081 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-kkt72"] Feb 19 09:05:15 crc kubenswrapper[4788]: E0219 09:05:15.365509 4788 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="sg-core" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.365528 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="sg-core" Feb 19 09:05:15 crc kubenswrapper[4788]: E0219 09:05:15.365554 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="ceilometer-notification-agent" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.365561 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="ceilometer-notification-agent" Feb 19 09:05:15 crc kubenswrapper[4788]: E0219 09:05:15.365576 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="ceilometer-central-agent" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.365583 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="ceilometer-central-agent" Feb 19 09:05:15 crc kubenswrapper[4788]: E0219 09:05:15.365595 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="proxy-httpd" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.365600 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="proxy-httpd" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.365779 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="ceilometer-central-agent" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.365798 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="sg-core" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.365814 4788 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="ceilometer-notification-agent" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.365830 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" containerName="proxy-httpd" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.366492 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.369702 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.376705 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.401313 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-kkt72"] Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.498432 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-scripts\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.498833 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.498877 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcxhm\" 
(UniqueName: \"kubernetes.io/projected/88c8f267-83eb-4e22-9e99-78c3dc096823-kube-api-access-mcxhm\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.498911 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-config-data\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.600578 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-scripts\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.600730 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.600749 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcxhm\" (UniqueName: \"kubernetes.io/projected/88c8f267-83eb-4e22-9e99-78c3dc096823-kube-api-access-mcxhm\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.600770 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-config-data\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.605316 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.605986 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-config-data\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.606683 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.623050 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.629506 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-scripts\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.634061 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcxhm\" (UniqueName: \"kubernetes.io/projected/88c8f267-83eb-4e22-9e99-78c3dc096823-kube-api-access-mcxhm\") pod \"nova-cell1-cell-mapping-kkt72\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") " 
pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.638035 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.640347 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.644875 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.645014 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.644887 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.651296 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.681703 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kkt72" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.808503 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcqwf\" (UniqueName: \"kubernetes.io/projected/0c273f46-7fd9-4269-98a8-1df269a9a915-kube-api-access-qcqwf\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.808747 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-config-data\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.808789 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-scripts\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.808827 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.808850 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 
09:05:15.808942 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.808989 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c273f46-7fd9-4269-98a8-1df269a9a915-run-httpd\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.809077 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c273f46-7fd9-4269-98a8-1df269a9a915-log-httpd\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.910472 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcqwf\" (UniqueName: \"kubernetes.io/projected/0c273f46-7fd9-4269-98a8-1df269a9a915-kube-api-access-qcqwf\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.910516 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-config-data\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.910545 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-scripts\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.910579 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.910603 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.910682 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.910723 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c273f46-7fd9-4269-98a8-1df269a9a915-run-httpd\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.910751 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c273f46-7fd9-4269-98a8-1df269a9a915-log-httpd\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 
09:05:15.911320 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c273f46-7fd9-4269-98a8-1df269a9a915-log-httpd\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.914047 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c273f46-7fd9-4269-98a8-1df269a9a915-run-httpd\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.915831 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.915922 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.916369 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-config-data\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.920070 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.921972 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c273f46-7fd9-4269-98a8-1df269a9a915-scripts\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:15 crc kubenswrapper[4788]: I0219 09:05:15.930932 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcqwf\" (UniqueName: \"kubernetes.io/projected/0c273f46-7fd9-4269-98a8-1df269a9a915-kube-api-access-qcqwf\") pod \"ceilometer-0\" (UID: \"0c273f46-7fd9-4269-98a8-1df269a9a915\") " pod="openstack/ceilometer-0" Feb 19 09:05:16 crc kubenswrapper[4788]: I0219 09:05:16.011003 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 19 09:05:16 crc kubenswrapper[4788]: I0219 09:05:16.163183 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-kkt72"] Feb 19 09:05:16 crc kubenswrapper[4788]: W0219 09:05:16.167512 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88c8f267_83eb_4e22_9e99_78c3dc096823.slice/crio-b2da2817fe48ed05e40229f654aaa6a3301fb891b3fe581c31e39dc55503b0a8 WatchSource:0}: Error finding container b2da2817fe48ed05e40229f654aaa6a3301fb891b3fe581c31e39dc55503b0a8: Status 404 returned error can't find the container with id b2da2817fe48ed05e40229f654aaa6a3301fb891b3fe581c31e39dc55503b0a8 Feb 19 09:05:16 crc kubenswrapper[4788]: I0219 09:05:16.176176 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc72f731-1258-4007-8119-30ad6b1837ef","Type":"ContainerStarted","Data":"b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802"} Feb 19 09:05:16 crc kubenswrapper[4788]: I0219 
09:05:16.176213 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc72f731-1258-4007-8119-30ad6b1837ef","Type":"ContainerStarted","Data":"1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce"} Feb 19 09:05:16 crc kubenswrapper[4788]: I0219 09:05:16.176223 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc72f731-1258-4007-8119-30ad6b1837ef","Type":"ContainerStarted","Data":"a6b0d440baad02e7ced2cc2b031ad7d37ff691fd0b09c8e341217c79109b6f1c"} Feb 19 09:05:16 crc kubenswrapper[4788]: I0219 09:05:16.202062 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.202039803 podStartE2EDuration="2.202039803s" podCreationTimestamp="2026-02-19 09:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:05:16.194039789 +0000 UTC m=+1218.182051261" watchObservedRunningTime="2026-02-19 09:05:16.202039803 +0000 UTC m=+1218.190051275" Feb 19 09:05:16 crc kubenswrapper[4788]: I0219 09:05:16.447862 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 19 09:05:16 crc kubenswrapper[4788]: W0219 09:05:16.449776 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c273f46_7fd9_4269_98a8_1df269a9a915.slice/crio-686c16d089343d8e406b78ec7f5e743f6691dea6705d0da1f3cc7f7076194921 WatchSource:0}: Error finding container 686c16d089343d8e406b78ec7f5e743f6691dea6705d0da1f3cc7f7076194921: Status 404 returned error can't find the container with id 686c16d089343d8e406b78ec7f5e743f6691dea6705d0da1f3cc7f7076194921 Feb 19 09:05:16 crc kubenswrapper[4788]: I0219 09:05:16.727068 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="342f5b09-d9bb-4b4b-8777-30dd3a8253ca" 
path="/var/lib/kubelet/pods/342f5b09-d9bb-4b4b-8777-30dd3a8253ca/volumes" Feb 19 09:05:17 crc kubenswrapper[4788]: I0219 09:05:17.188235 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c273f46-7fd9-4269-98a8-1df269a9a915","Type":"ContainerStarted","Data":"8c2feb9d1d3670217017ac4ec4ac95f716cd59251a654f09d2764d45e911d181"} Feb 19 09:05:17 crc kubenswrapper[4788]: I0219 09:05:17.188334 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c273f46-7fd9-4269-98a8-1df269a9a915","Type":"ContainerStarted","Data":"686c16d089343d8e406b78ec7f5e743f6691dea6705d0da1f3cc7f7076194921"} Feb 19 09:05:17 crc kubenswrapper[4788]: I0219 09:05:17.190463 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kkt72" event={"ID":"88c8f267-83eb-4e22-9e99-78c3dc096823","Type":"ContainerStarted","Data":"21d342bbc201906aca6674b1f66d888bdcd3ff4baec7beab73679a1108af5082"} Feb 19 09:05:17 crc kubenswrapper[4788]: I0219 09:05:17.190903 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kkt72" event={"ID":"88c8f267-83eb-4e22-9e99-78c3dc096823","Type":"ContainerStarted","Data":"b2da2817fe48ed05e40229f654aaa6a3301fb891b3fe581c31e39dc55503b0a8"} Feb 19 09:05:17 crc kubenswrapper[4788]: I0219 09:05:17.572269 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:05:17 crc kubenswrapper[4788]: I0219 09:05:17.615749 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-kkt72" podStartSLOduration=2.61572745 podStartE2EDuration="2.61572745s" podCreationTimestamp="2026-02-19 09:05:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:05:17.212622909 +0000 UTC m=+1219.200634381" 
watchObservedRunningTime="2026-02-19 09:05:17.61572745 +0000 UTC m=+1219.603738922" Feb 19 09:05:17 crc kubenswrapper[4788]: I0219 09:05:17.651343 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-8z8l9"] Feb 19 09:05:17 crc kubenswrapper[4788]: I0219 09:05:17.651822 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" podUID="e055219d-2144-4750-8255-9bc573b74163" containerName="dnsmasq-dns" containerID="cri-o://ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c" gracePeriod=10 Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.121326 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.200322 4788 generic.go:334] "Generic (PLEG): container finished" podID="e055219d-2144-4750-8255-9bc573b74163" containerID="ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c" exitCode=0 Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.200379 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9"
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.200402 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" event={"ID":"e055219d-2144-4750-8255-9bc573b74163","Type":"ContainerDied","Data":"ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c"}
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.200449 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-8z8l9" event={"ID":"e055219d-2144-4750-8255-9bc573b74163","Type":"ContainerDied","Data":"864f3c952385a8b96978e7fb03fb8e947385c5e648bc7050a81c3877837aaea2"}
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.200470 4788 scope.go:117] "RemoveContainer" containerID="ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c"
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.202139 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c273f46-7fd9-4269-98a8-1df269a9a915","Type":"ContainerStarted","Data":"17b2203d37d44f7e2781efe147cd416639e9b30848cc815e7fcdddc6654d6ce0"}
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.226646 4788 scope.go:117] "RemoveContainer" containerID="85566eeb878aef018b35fbdb094a4c3e4296ca5bfcdb24d34df15193e37f82f0"
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.246240 4788 scope.go:117] "RemoveContainer" containerID="ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c"
Feb 19 09:05:18 crc kubenswrapper[4788]: E0219 09:05:18.246853 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c\": container with ID starting with ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c not found: ID does not exist" containerID="ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c"
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.246899 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c"} err="failed to get container status \"ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c\": rpc error: code = NotFound desc = could not find container \"ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c\": container with ID starting with ead09866b037ed643b75c815661c1ff227e107fd58cc3806e77ddbeece9c8e0c not found: ID does not exist"
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.246927 4788 scope.go:117] "RemoveContainer" containerID="85566eeb878aef018b35fbdb094a4c3e4296ca5bfcdb24d34df15193e37f82f0"
Feb 19 09:05:18 crc kubenswrapper[4788]: E0219 09:05:18.247354 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85566eeb878aef018b35fbdb094a4c3e4296ca5bfcdb24d34df15193e37f82f0\": container with ID starting with 85566eeb878aef018b35fbdb094a4c3e4296ca5bfcdb24d34df15193e37f82f0 not found: ID does not exist" containerID="85566eeb878aef018b35fbdb094a4c3e4296ca5bfcdb24d34df15193e37f82f0"
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.247377 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85566eeb878aef018b35fbdb094a4c3e4296ca5bfcdb24d34df15193e37f82f0"} err="failed to get container status \"85566eeb878aef018b35fbdb094a4c3e4296ca5bfcdb24d34df15193e37f82f0\": rpc error: code = NotFound desc = could not find container \"85566eeb878aef018b35fbdb094a4c3e4296ca5bfcdb24d34df15193e37f82f0\": container with ID starting with 85566eeb878aef018b35fbdb094a4c3e4296ca5bfcdb24d34df15193e37f82f0 not found: ID does not exist"
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.271212 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-svc\") pod \"e055219d-2144-4750-8255-9bc573b74163\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") "
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.271297 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvvtr\" (UniqueName: \"kubernetes.io/projected/e055219d-2144-4750-8255-9bc573b74163-kube-api-access-hvvtr\") pod \"e055219d-2144-4750-8255-9bc573b74163\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") "
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.271367 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-sb\") pod \"e055219d-2144-4750-8255-9bc573b74163\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") "
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.271390 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-swift-storage-0\") pod \"e055219d-2144-4750-8255-9bc573b74163\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") "
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.271414 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-nb\") pod \"e055219d-2144-4750-8255-9bc573b74163\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") "
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.271434 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-config\") pod \"e055219d-2144-4750-8255-9bc573b74163\" (UID: \"e055219d-2144-4750-8255-9bc573b74163\") "
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.277234 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e055219d-2144-4750-8255-9bc573b74163-kube-api-access-hvvtr" (OuterVolumeSpecName: "kube-api-access-hvvtr") pod "e055219d-2144-4750-8255-9bc573b74163" (UID: "e055219d-2144-4750-8255-9bc573b74163"). InnerVolumeSpecName "kube-api-access-hvvtr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.320365 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e055219d-2144-4750-8255-9bc573b74163" (UID: "e055219d-2144-4750-8255-9bc573b74163"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.323929 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e055219d-2144-4750-8255-9bc573b74163" (UID: "e055219d-2144-4750-8255-9bc573b74163"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.324952 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e055219d-2144-4750-8255-9bc573b74163" (UID: "e055219d-2144-4750-8255-9bc573b74163"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.332091 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e055219d-2144-4750-8255-9bc573b74163" (UID: "e055219d-2144-4750-8255-9bc573b74163"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.335675 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-config" (OuterVolumeSpecName: "config") pod "e055219d-2144-4750-8255-9bc573b74163" (UID: "e055219d-2144-4750-8255-9bc573b74163"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.373616 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.373684 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvvtr\" (UniqueName: \"kubernetes.io/projected/e055219d-2144-4750-8255-9bc573b74163-kube-api-access-hvvtr\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.373700 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.373712 4788 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.373792 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.373915 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e055219d-2144-4750-8255-9bc573b74163-config\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.545595 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-8z8l9"]
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.557692 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-8z8l9"]
Feb 19 09:05:18 crc kubenswrapper[4788]: I0219 09:05:18.734358 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e055219d-2144-4750-8255-9bc573b74163" path="/var/lib/kubelet/pods/e055219d-2144-4750-8255-9bc573b74163/volumes"
Feb 19 09:05:19 crc kubenswrapper[4788]: I0219 09:05:19.220853 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c273f46-7fd9-4269-98a8-1df269a9a915","Type":"ContainerStarted","Data":"35187f9cd3d343ce15fe86b49c7d3c8a333a551dadbe38f7b3007be18e0ba1c4"}
Feb 19 09:05:21 crc kubenswrapper[4788]: I0219 09:05:21.251120 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c273f46-7fd9-4269-98a8-1df269a9a915","Type":"ContainerStarted","Data":"323ae67ba718f2a0e700dfb68c39fd958f80c328e6ff2ad84e5d1e390c37cade"}
Feb 19 09:05:21 crc kubenswrapper[4788]: I0219 09:05:21.251717 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 19 09:05:21 crc kubenswrapper[4788]: I0219 09:05:21.284533 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.037192244 podStartE2EDuration="6.284516569s" podCreationTimestamp="2026-02-19 09:05:15 +0000 UTC" firstStartedPulling="2026-02-19 09:05:16.451447735 +0000 UTC m=+1218.439459207" lastFinishedPulling="2026-02-19 09:05:20.69877205 +0000 UTC m=+1222.686783532" observedRunningTime="2026-02-19 09:05:21.273896513 +0000 UTC m=+1223.261907985" watchObservedRunningTime="2026-02-19 09:05:21.284516569 +0000 UTC m=+1223.272528041"
Feb 19 09:05:21 crc kubenswrapper[4788]: E0219 09:05:21.443943 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88c8f267_83eb_4e22_9e99_78c3dc096823.slice/crio-conmon-21d342bbc201906aca6674b1f66d888bdcd3ff4baec7beab73679a1108af5082.scope\": RecentStats: unable to find data in memory cache]"
Feb 19 09:05:22 crc kubenswrapper[4788]: I0219 09:05:22.268765 4788 generic.go:334] "Generic (PLEG): container finished" podID="88c8f267-83eb-4e22-9e99-78c3dc096823" containerID="21d342bbc201906aca6674b1f66d888bdcd3ff4baec7beab73679a1108af5082" exitCode=0
Feb 19 09:05:22 crc kubenswrapper[4788]: I0219 09:05:22.268826 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kkt72" event={"ID":"88c8f267-83eb-4e22-9e99-78c3dc096823","Type":"ContainerDied","Data":"21d342bbc201906aca6674b1f66d888bdcd3ff4baec7beab73679a1108af5082"}
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.683598 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kkt72"
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.776172 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-config-data\") pod \"88c8f267-83eb-4e22-9e99-78c3dc096823\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") "
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.777736 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-scripts\") pod \"88c8f267-83eb-4e22-9e99-78c3dc096823\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") "
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.777966 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcxhm\" (UniqueName: \"kubernetes.io/projected/88c8f267-83eb-4e22-9e99-78c3dc096823-kube-api-access-mcxhm\") pod \"88c8f267-83eb-4e22-9e99-78c3dc096823\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") "
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.778013 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-combined-ca-bundle\") pod \"88c8f267-83eb-4e22-9e99-78c3dc096823\" (UID: \"88c8f267-83eb-4e22-9e99-78c3dc096823\") "
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.784802 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88c8f267-83eb-4e22-9e99-78c3dc096823-kube-api-access-mcxhm" (OuterVolumeSpecName: "kube-api-access-mcxhm") pod "88c8f267-83eb-4e22-9e99-78c3dc096823" (UID: "88c8f267-83eb-4e22-9e99-78c3dc096823"). InnerVolumeSpecName "kube-api-access-mcxhm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.785458 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-scripts" (OuterVolumeSpecName: "scripts") pod "88c8f267-83eb-4e22-9e99-78c3dc096823" (UID: "88c8f267-83eb-4e22-9e99-78c3dc096823"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.807987 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-config-data" (OuterVolumeSpecName: "config-data") pod "88c8f267-83eb-4e22-9e99-78c3dc096823" (UID: "88c8f267-83eb-4e22-9e99-78c3dc096823"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.811296 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88c8f267-83eb-4e22-9e99-78c3dc096823" (UID: "88c8f267-83eb-4e22-9e99-78c3dc096823"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.881437 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcxhm\" (UniqueName: \"kubernetes.io/projected/88c8f267-83eb-4e22-9e99-78c3dc096823-kube-api-access-mcxhm\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.881479 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.881491 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:23 crc kubenswrapper[4788]: I0219 09:05:23.881503 4788 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88c8f267-83eb-4e22-9e99-78c3dc096823-scripts\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:24 crc kubenswrapper[4788]: I0219 09:05:24.302134 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kkt72" event={"ID":"88c8f267-83eb-4e22-9e99-78c3dc096823","Type":"ContainerDied","Data":"b2da2817fe48ed05e40229f654aaa6a3301fb891b3fe581c31e39dc55503b0a8"}
Feb 19 09:05:24 crc kubenswrapper[4788]: I0219 09:05:24.302772 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2da2817fe48ed05e40229f654aaa6a3301fb891b3fe581c31e39dc55503b0a8"
Feb 19 09:05:24 crc kubenswrapper[4788]: I0219 09:05:24.302213 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kkt72"
Feb 19 09:05:24 crc kubenswrapper[4788]: I0219 09:05:24.495041 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 19 09:05:24 crc kubenswrapper[4788]: I0219 09:05:24.495368 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="cb45b82b-6e45-4780-a2e7-99dcddc48974" containerName="nova-scheduler-scheduler" containerID="cri-o://c77958088275cc498f43c2050efd4a95057a16e5178fb4c4224d8ac2ed6ee44d" gracePeriod=30
Feb 19 09:05:24 crc kubenswrapper[4788]: I0219 09:05:24.512072 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 09:05:24 crc kubenswrapper[4788]: I0219 09:05:24.512358 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fc72f731-1258-4007-8119-30ad6b1837ef" containerName="nova-api-log" containerID="cri-o://1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce" gracePeriod=30
Feb 19 09:05:24 crc kubenswrapper[4788]: I0219 09:05:24.512498 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fc72f731-1258-4007-8119-30ad6b1837ef" containerName="nova-api-api" containerID="cri-o://b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802" gracePeriod=30
Feb 19 09:05:24 crc kubenswrapper[4788]: I0219 09:05:24.540236 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:05:24 crc kubenswrapper[4788]: I0219 09:05:24.540680 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-log" containerID="cri-o://697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084" gracePeriod=30
Feb 19 09:05:24 crc kubenswrapper[4788]: I0219 09:05:24.540688 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-metadata" containerID="cri-o://37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc" gracePeriod=30
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.105809 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.204031 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-internal-tls-certs\") pod \"fc72f731-1258-4007-8119-30ad6b1837ef\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") "
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.204230 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-public-tls-certs\") pod \"fc72f731-1258-4007-8119-30ad6b1837ef\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") "
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.204851 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc72f731-1258-4007-8119-30ad6b1837ef-logs\") pod \"fc72f731-1258-4007-8119-30ad6b1837ef\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") "
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.204934 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-config-data\") pod \"fc72f731-1258-4007-8119-30ad6b1837ef\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") "
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.204958 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q95n5\" (UniqueName: \"kubernetes.io/projected/fc72f731-1258-4007-8119-30ad6b1837ef-kube-api-access-q95n5\") pod \"fc72f731-1258-4007-8119-30ad6b1837ef\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") "
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.205006 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-combined-ca-bundle\") pod \"fc72f731-1258-4007-8119-30ad6b1837ef\" (UID: \"fc72f731-1258-4007-8119-30ad6b1837ef\") "
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.205157 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc72f731-1258-4007-8119-30ad6b1837ef-logs" (OuterVolumeSpecName: "logs") pod "fc72f731-1258-4007-8119-30ad6b1837ef" (UID: "fc72f731-1258-4007-8119-30ad6b1837ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.205745 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc72f731-1258-4007-8119-30ad6b1837ef-logs\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.210795 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc72f731-1258-4007-8119-30ad6b1837ef-kube-api-access-q95n5" (OuterVolumeSpecName: "kube-api-access-q95n5") pod "fc72f731-1258-4007-8119-30ad6b1837ef" (UID: "fc72f731-1258-4007-8119-30ad6b1837ef"). InnerVolumeSpecName "kube-api-access-q95n5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.237213 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-config-data" (OuterVolumeSpecName: "config-data") pod "fc72f731-1258-4007-8119-30ad6b1837ef" (UID: "fc72f731-1258-4007-8119-30ad6b1837ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.241019 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc72f731-1258-4007-8119-30ad6b1837ef" (UID: "fc72f731-1258-4007-8119-30ad6b1837ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.256347 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fc72f731-1258-4007-8119-30ad6b1837ef" (UID: "fc72f731-1258-4007-8119-30ad6b1837ef"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.256746 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fc72f731-1258-4007-8119-30ad6b1837ef" (UID: "fc72f731-1258-4007-8119-30ad6b1837ef"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.307077 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.307115 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q95n5\" (UniqueName: \"kubernetes.io/projected/fc72f731-1258-4007-8119-30ad6b1837ef-kube-api-access-q95n5\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.307128 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.307142 4788 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.307153 4788 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc72f731-1258-4007-8119-30ad6b1837ef-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.313677 4788 generic.go:334] "Generic (PLEG): container finished" podID="fc72f731-1258-4007-8119-30ad6b1837ef" containerID="b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802" exitCode=0
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.313731 4788 generic.go:334] "Generic (PLEG): container finished" podID="fc72f731-1258-4007-8119-30ad6b1837ef" containerID="1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce" exitCode=143
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.313731 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc72f731-1258-4007-8119-30ad6b1837ef","Type":"ContainerDied","Data":"b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802"}
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.313786 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc72f731-1258-4007-8119-30ad6b1837ef","Type":"ContainerDied","Data":"1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce"}
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.313801 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fc72f731-1258-4007-8119-30ad6b1837ef","Type":"ContainerDied","Data":"a6b0d440baad02e7ced2cc2b031ad7d37ff691fd0b09c8e341217c79109b6f1c"}
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.313732 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.313816 4788 scope.go:117] "RemoveContainer" containerID="b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.316419 4788 generic.go:334] "Generic (PLEG): container finished" podID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerID="697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084" exitCode=143
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.316461 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3812032-b03d-4773-bb95-7ccbb252b7de","Type":"ContainerDied","Data":"697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084"}
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.348151 4788 scope.go:117] "RemoveContainer" containerID="1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.353076 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.361882 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.380066 4788 scope.go:117] "RemoveContainer" containerID="b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802"
Feb 19 09:05:25 crc kubenswrapper[4788]: E0219 09:05:25.380542 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802\": container with ID starting with b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802 not found: ID does not exist" containerID="b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.380589 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802"} err="failed to get container status \"b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802\": rpc error: code = NotFound desc = could not find container \"b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802\": container with ID starting with b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802 not found: ID does not exist"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.380615 4788 scope.go:117] "RemoveContainer" containerID="1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce"
Feb 19 09:05:25 crc kubenswrapper[4788]: E0219 09:05:25.381034 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce\": container with ID starting with 1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce not found: ID does not exist" containerID="1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.381072 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce"} err="failed to get container status \"1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce\": rpc error: code = NotFound desc = could not find container \"1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce\": container with ID starting with 1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce not found: ID does not exist"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.381103 4788 scope.go:117] "RemoveContainer" containerID="b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.381403 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802"} err="failed to get container status \"b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802\": rpc error: code = NotFound desc = could not find container \"b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802\": container with ID starting with b0aea82a68620b61345ecbb280f56d400b77122ca9bb1658f4d5de3a197e4802 not found: ID does not exist"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.381423 4788 scope.go:117] "RemoveContainer" containerID="1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.381693 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce"} err="failed to get container status \"1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce\": rpc error: code = NotFound desc = could not find container \"1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce\": container with ID starting with 1fdec44824e13ce9aef1772ce6356d8e3e01bdcff48d3f820cb6c1976ef4e4ce not found: ID does not exist"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.396056 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 19 09:05:25 crc kubenswrapper[4788]: E0219 09:05:25.396436 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc72f731-1258-4007-8119-30ad6b1837ef" containerName="nova-api-api"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.396451 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc72f731-1258-4007-8119-30ad6b1837ef" containerName="nova-api-api"
Feb 19 09:05:25 crc kubenswrapper[4788]: E0219 09:05:25.396468 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc72f731-1258-4007-8119-30ad6b1837ef" containerName="nova-api-log"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.396474 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc72f731-1258-4007-8119-30ad6b1837ef" containerName="nova-api-log"
Feb 19 09:05:25 crc kubenswrapper[4788]: E0219 09:05:25.396483 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c8f267-83eb-4e22-9e99-78c3dc096823" containerName="nova-manage"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.396491 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c8f267-83eb-4e22-9e99-78c3dc096823" containerName="nova-manage"
Feb 19 09:05:25 crc kubenswrapper[4788]: E0219 09:05:25.396504 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e055219d-2144-4750-8255-9bc573b74163" containerName="init"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.396510 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e055219d-2144-4750-8255-9bc573b74163" containerName="init"
Feb 19 09:05:25 crc kubenswrapper[4788]: E0219 09:05:25.396525 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e055219d-2144-4750-8255-9bc573b74163" containerName="dnsmasq-dns"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.396530 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e055219d-2144-4750-8255-9bc573b74163" containerName="dnsmasq-dns"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.396684 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="e055219d-2144-4750-8255-9bc573b74163" containerName="dnsmasq-dns"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.396698 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc72f731-1258-4007-8119-30ad6b1837ef" containerName="nova-api-api"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.396718 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc72f731-1258-4007-8119-30ad6b1837ef" containerName="nova-api-log"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.396727 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c8f267-83eb-4e22-9e99-78c3dc096823" containerName="nova-manage"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.397598 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.399542 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.399717 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.399937 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.437467 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.513299 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.513842 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2530b11c-c0ce-4ab3-9a0b-70060eb85184-logs\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.513951 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0"
Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.514093 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-config-data\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.514216 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4h42\" (UniqueName: \"kubernetes.io/projected/2530b11c-c0ce-4ab3-9a0b-70060eb85184-kube-api-access-b4h42\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.514351 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-public-tls-certs\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.615991 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.616038 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2530b11c-c0ce-4ab3-9a0b-70060eb85184-logs\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.616064 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc 
kubenswrapper[4788]: I0219 09:05:25.616115 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-config-data\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.616164 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4h42\" (UniqueName: \"kubernetes.io/projected/2530b11c-c0ce-4ab3-9a0b-70060eb85184-kube-api-access-b4h42\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.616201 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-public-tls-certs\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.617190 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2530b11c-c0ce-4ab3-9a0b-70060eb85184-logs\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.629972 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.630345 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.635488 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.636199 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2530b11c-c0ce-4ab3-9a0b-70060eb85184-config-data\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.637772 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4h42\" (UniqueName: \"kubernetes.io/projected/2530b11c-c0ce-4ab3-9a0b-70060eb85184-kube-api-access-b4h42\") pod \"nova-api-0\" (UID: \"2530b11c-c0ce-4ab3-9a0b-70060eb85184\") " pod="openstack/nova-api-0" Feb 19 09:05:25 crc kubenswrapper[4788]: I0219 09:05:25.722359 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.189465 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 19 09:05:26 crc kubenswrapper[4788]: W0219 09:05:26.214889 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2530b11c_c0ce_4ab3_9a0b_70060eb85184.slice/crio-64045ce56335eb37e775bb789f649afc474b7df28503f2274920641491041518 WatchSource:0}: Error finding container 64045ce56335eb37e775bb789f649afc474b7df28503f2274920641491041518: Status 404 returned error can't find the container with id 64045ce56335eb37e775bb789f649afc474b7df28503f2274920641491041518 Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.334336 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2530b11c-c0ce-4ab3-9a0b-70060eb85184","Type":"ContainerStarted","Data":"64045ce56335eb37e775bb789f649afc474b7df28503f2274920641491041518"} Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.337029 4788 generic.go:334] "Generic (PLEG): container finished" podID="cb45b82b-6e45-4780-a2e7-99dcddc48974" containerID="c77958088275cc498f43c2050efd4a95057a16e5178fb4c4224d8ac2ed6ee44d" exitCode=0 Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.337074 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb45b82b-6e45-4780-a2e7-99dcddc48974","Type":"ContainerDied","Data":"c77958088275cc498f43c2050efd4a95057a16e5178fb4c4224d8ac2ed6ee44d"} Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.616154 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.724859 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc72f731-1258-4007-8119-30ad6b1837ef" path="/var/lib/kubelet/pods/fc72f731-1258-4007-8119-30ad6b1837ef/volumes" Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.737724 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlr9c\" (UniqueName: \"kubernetes.io/projected/cb45b82b-6e45-4780-a2e7-99dcddc48974-kube-api-access-mlr9c\") pod \"cb45b82b-6e45-4780-a2e7-99dcddc48974\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.738012 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-combined-ca-bundle\") pod \"cb45b82b-6e45-4780-a2e7-99dcddc48974\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.738087 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-config-data\") pod \"cb45b82b-6e45-4780-a2e7-99dcddc48974\" (UID: \"cb45b82b-6e45-4780-a2e7-99dcddc48974\") " Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.752860 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb45b82b-6e45-4780-a2e7-99dcddc48974-kube-api-access-mlr9c" (OuterVolumeSpecName: "kube-api-access-mlr9c") pod "cb45b82b-6e45-4780-a2e7-99dcddc48974" (UID: "cb45b82b-6e45-4780-a2e7-99dcddc48974"). InnerVolumeSpecName "kube-api-access-mlr9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.766989 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-config-data" (OuterVolumeSpecName: "config-data") pod "cb45b82b-6e45-4780-a2e7-99dcddc48974" (UID: "cb45b82b-6e45-4780-a2e7-99dcddc48974"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.772003 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb45b82b-6e45-4780-a2e7-99dcddc48974" (UID: "cb45b82b-6e45-4780-a2e7-99dcddc48974"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.841059 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.841093 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb45b82b-6e45-4780-a2e7-99dcddc48974-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:26 crc kubenswrapper[4788]: I0219 09:05:26.841101 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlr9c\" (UniqueName: \"kubernetes.io/projected/cb45b82b-6e45-4780-a2e7-99dcddc48974-kube-api-access-mlr9c\") on node \"crc\" DevicePath \"\"" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.351056 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2530b11c-c0ce-4ab3-9a0b-70060eb85184","Type":"ContainerStarted","Data":"b2816335caeac1a92eb32bce2740c34da38d98351f0cd2e6ca9e9a3df27e4002"} Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.351118 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2530b11c-c0ce-4ab3-9a0b-70060eb85184","Type":"ContainerStarted","Data":"b97eb6fab87da3000bd4395417285c1fce7957ee5e0a44da79179aaa423d32da"} Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.353460 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"cb45b82b-6e45-4780-a2e7-99dcddc48974","Type":"ContainerDied","Data":"a8f3de6fbdd734033333da5218c4569baa79a42e6b32aace7751563aa3fd530d"} Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.353515 4788 scope.go:117] "RemoveContainer" containerID="c77958088275cc498f43c2050efd4a95057a16e5178fb4c4224d8ac2ed6ee44d" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.353529 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.389371 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.389354498 podStartE2EDuration="2.389354498s" podCreationTimestamp="2026-02-19 09:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:05:27.38613328 +0000 UTC m=+1229.374144792" watchObservedRunningTime="2026-02-19 09:05:27.389354498 +0000 UTC m=+1229.377365970" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.423068 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.437304 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.446471 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 09:05:27 crc kubenswrapper[4788]: E0219 09:05:27.446910 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb45b82b-6e45-4780-a2e7-99dcddc48974" containerName="nova-scheduler-scheduler" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.446931 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb45b82b-6e45-4780-a2e7-99dcddc48974" containerName="nova-scheduler-scheduler" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.447174 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb45b82b-6e45-4780-a2e7-99dcddc48974" containerName="nova-scheduler-scheduler" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.447991 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.454572 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.455662 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.553389 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t494b\" (UniqueName: \"kubernetes.io/projected/a6034eb4-975c-485a-b636-25fa666dd148-kube-api-access-t494b\") pod \"nova-scheduler-0\" (UID: \"a6034eb4-975c-485a-b636-25fa666dd148\") " pod="openstack/nova-scheduler-0" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.553853 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6034eb4-975c-485a-b636-25fa666dd148-config-data\") pod \"nova-scheduler-0\" (UID: \"a6034eb4-975c-485a-b636-25fa666dd148\") " pod="openstack/nova-scheduler-0" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.553916 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6034eb4-975c-485a-b636-25fa666dd148-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6034eb4-975c-485a-b636-25fa666dd148\") " pod="openstack/nova-scheduler-0" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.656958 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6034eb4-975c-485a-b636-25fa666dd148-config-data\") pod \"nova-scheduler-0\" (UID: \"a6034eb4-975c-485a-b636-25fa666dd148\") " pod="openstack/nova-scheduler-0" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.657076 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6034eb4-975c-485a-b636-25fa666dd148-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6034eb4-975c-485a-b636-25fa666dd148\") " pod="openstack/nova-scheduler-0" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.657328 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t494b\" (UniqueName: \"kubernetes.io/projected/a6034eb4-975c-485a-b636-25fa666dd148-kube-api-access-t494b\") pod \"nova-scheduler-0\" (UID: \"a6034eb4-975c-485a-b636-25fa666dd148\") " pod="openstack/nova-scheduler-0" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.666408 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6034eb4-975c-485a-b636-25fa666dd148-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a6034eb4-975c-485a-b636-25fa666dd148\") " pod="openstack/nova-scheduler-0" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.666508 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6034eb4-975c-485a-b636-25fa666dd148-config-data\") pod \"nova-scheduler-0\" (UID: \"a6034eb4-975c-485a-b636-25fa666dd148\") " pod="openstack/nova-scheduler-0" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.681339 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t494b\" (UniqueName: \"kubernetes.io/projected/a6034eb4-975c-485a-b636-25fa666dd148-kube-api-access-t494b\") pod \"nova-scheduler-0\" (UID: \"a6034eb4-975c-485a-b636-25fa666dd148\") " pod="openstack/nova-scheduler-0" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.701428 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-log" probeResult="failure" 
output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:49030->10.217.0.203:8775: read: connection reset by peer" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.701468 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:49034->10.217.0.203:8775: read: connection reset by peer" Feb 19 09:05:27 crc kubenswrapper[4788]: I0219 09:05:27.781651 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.301571 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.376112 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-combined-ca-bundle\") pod \"e3812032-b03d-4773-bb95-7ccbb252b7de\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.376206 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-config-data\") pod \"e3812032-b03d-4773-bb95-7ccbb252b7de\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.376372 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3812032-b03d-4773-bb95-7ccbb252b7de-logs\") pod \"e3812032-b03d-4773-bb95-7ccbb252b7de\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.376414 4788 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-rl2cr\" (UniqueName: \"kubernetes.io/projected/e3812032-b03d-4773-bb95-7ccbb252b7de-kube-api-access-rl2cr\") pod \"e3812032-b03d-4773-bb95-7ccbb252b7de\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.376489 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-nova-metadata-tls-certs\") pod \"e3812032-b03d-4773-bb95-7ccbb252b7de\" (UID: \"e3812032-b03d-4773-bb95-7ccbb252b7de\") " Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.378033 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3812032-b03d-4773-bb95-7ccbb252b7de-logs" (OuterVolumeSpecName: "logs") pod "e3812032-b03d-4773-bb95-7ccbb252b7de" (UID: "e3812032-b03d-4773-bb95-7ccbb252b7de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.381950 4788 generic.go:334] "Generic (PLEG): container finished" podID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerID="37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc" exitCode=0 Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.382041 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.382113 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3812032-b03d-4773-bb95-7ccbb252b7de","Type":"ContainerDied","Data":"37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc"} Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.382153 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3812032-b03d-4773-bb95-7ccbb252b7de","Type":"ContainerDied","Data":"17e7d9c44446319f661c66c8f2f07214532fa665007f695c1362eecd3a43cdf7"} Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.382175 4788 scope.go:117] "RemoveContainer" containerID="37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc" Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.387465 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3812032-b03d-4773-bb95-7ccbb252b7de-kube-api-access-rl2cr" (OuterVolumeSpecName: "kube-api-access-rl2cr") pod "e3812032-b03d-4773-bb95-7ccbb252b7de" (UID: "e3812032-b03d-4773-bb95-7ccbb252b7de"). InnerVolumeSpecName "kube-api-access-rl2cr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.425581 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3812032-b03d-4773-bb95-7ccbb252b7de" (UID: "e3812032-b03d-4773-bb95-7ccbb252b7de"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:28 crc kubenswrapper[4788]: W0219 09:05:28.426164 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6034eb4_975c_485a_b636_25fa666dd148.slice/crio-c1f361c7dd333b9e49e20e18accb28070dafd20d9aaaedc950a80c9e82b43a7a WatchSource:0}: Error finding container c1f361c7dd333b9e49e20e18accb28070dafd20d9aaaedc950a80c9e82b43a7a: Status 404 returned error can't find the container with id c1f361c7dd333b9e49e20e18accb28070dafd20d9aaaedc950a80c9e82b43a7a Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.427614 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-config-data" (OuterVolumeSpecName: "config-data") pod "e3812032-b03d-4773-bb95-7ccbb252b7de" (UID: "e3812032-b03d-4773-bb95-7ccbb252b7de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.436358 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.444533 4788 scope.go:117] "RemoveContainer" containerID="697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084" Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.475219 4788 scope.go:117] "RemoveContainer" containerID="37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc" Feb 19 09:05:28 crc kubenswrapper[4788]: E0219 09:05:28.475734 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc\": container with ID starting with 37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc not found: ID does not exist" containerID="37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc" Feb 
19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.475776 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc"} err="failed to get container status \"37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc\": rpc error: code = NotFound desc = could not find container \"37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc\": container with ID starting with 37c1abb48ac857d33e9e8b54e5b29d357c295387ed622c23f82eff2ab4cb8ddc not found: ID does not exist" Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.475797 4788 scope.go:117] "RemoveContainer" containerID="697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084" Feb 19 09:05:28 crc kubenswrapper[4788]: E0219 09:05:28.476090 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084\": container with ID starting with 697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084 not found: ID does not exist" containerID="697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084" Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.476116 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084"} err="failed to get container status \"697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084\": rpc error: code = NotFound desc = could not find container \"697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084\": container with ID starting with 697d2190059df4628a37d59ec32431634b003681a0b6e6e4d2aacc4c412b5084 not found: ID does not exist" Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.478321 4788 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.478338 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.478350 4788 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3812032-b03d-4773-bb95-7ccbb252b7de-logs\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.478360 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl2cr\" (UniqueName: \"kubernetes.io/projected/e3812032-b03d-4773-bb95-7ccbb252b7de-kube-api-access-rl2cr\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.479672 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e3812032-b03d-4773-bb95-7ccbb252b7de" (UID: "e3812032-b03d-4773-bb95-7ccbb252b7de"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.579926 4788 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3812032-b03d-4773-bb95-7ccbb252b7de-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.734604 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb45b82b-6e45-4780-a2e7-99dcddc48974" path="/var/lib/kubelet/pods/cb45b82b-6e45-4780-a2e7-99dcddc48974/volumes"
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.821184 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.833621 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.844003 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:05:28 crc kubenswrapper[4788]: E0219 09:05:28.844454 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-metadata"
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.844472 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-metadata"
Feb 19 09:05:28 crc kubenswrapper[4788]: E0219 09:05:28.844483 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-log"
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.844489 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-log"
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.844676 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-log"
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.844696 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" containerName="nova-metadata-metadata"
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.845723 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.847884 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.848883 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Feb 19 09:05:28 crc kubenswrapper[4788]: I0219 09:05:28.853148 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.002913 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-logs\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.003231 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.003273 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.003295 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l747n\" (UniqueName: \"kubernetes.io/projected/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-kube-api-access-l747n\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.003346 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-config-data\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.104733 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-logs\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.104881 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.104910 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.104934 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l747n\" (UniqueName: \"kubernetes.io/projected/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-kube-api-access-l747n\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.104976 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-config-data\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.106035 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-logs\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.111203 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.111350 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-config-data\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.113675 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.130207 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l747n\" (UniqueName: \"kubernetes.io/projected/fb5bf2a2-d945-4fab-a232-ee95c75d94d0-kube-api-access-l747n\") pod \"nova-metadata-0\" (UID: \"fb5bf2a2-d945-4fab-a232-ee95c75d94d0\") " pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.164502 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.400835 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a6034eb4-975c-485a-b636-25fa666dd148","Type":"ContainerStarted","Data":"d28f88d63a0df6d11bc4d5d425beab3a4d4244b666ad02d3d2b54d27663b491a"}
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.400927 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a6034eb4-975c-485a-b636-25fa666dd148","Type":"ContainerStarted","Data":"c1f361c7dd333b9e49e20e18accb28070dafd20d9aaaedc950a80c9e82b43a7a"}
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.433862 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.433841265 podStartE2EDuration="2.433841265s" podCreationTimestamp="2026-02-19 09:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:05:29.4323917 +0000 UTC m=+1231.420403172" watchObservedRunningTime="2026-02-19 09:05:29.433841265 +0000 UTC m=+1231.421852747"
Feb 19 09:05:29 crc kubenswrapper[4788]: I0219 09:05:29.692929 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 19 09:05:29 crc kubenswrapper[4788]: W0219 09:05:29.699346 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb5bf2a2_d945_4fab_a232_ee95c75d94d0.slice/crio-0dde09a3107fa1185095952196890992d6c6bbb6324cad4e4ab76b0df157943b WatchSource:0}: Error finding container 0dde09a3107fa1185095952196890992d6c6bbb6324cad4e4ab76b0df157943b: Status 404 returned error can't find the container with id 0dde09a3107fa1185095952196890992d6c6bbb6324cad4e4ab76b0df157943b
Feb 19 09:05:30 crc kubenswrapper[4788]: I0219 09:05:30.416093 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb5bf2a2-d945-4fab-a232-ee95c75d94d0","Type":"ContainerStarted","Data":"f82e296164a8ed7c2c085e15b80c0890e8c9fc77d7f059992b2f63007f623488"}
Feb 19 09:05:30 crc kubenswrapper[4788]: I0219 09:05:30.416708 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb5bf2a2-d945-4fab-a232-ee95c75d94d0","Type":"ContainerStarted","Data":"bc2830b60e6fcec1b1e6699b3dbd5e7ac7374d2f889c3b3d1df1749865c71d26"}
Feb 19 09:05:30 crc kubenswrapper[4788]: I0219 09:05:30.416733 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb5bf2a2-d945-4fab-a232-ee95c75d94d0","Type":"ContainerStarted","Data":"0dde09a3107fa1185095952196890992d6c6bbb6324cad4e4ab76b0df157943b"}
Feb 19 09:05:30 crc kubenswrapper[4788]: I0219 09:05:30.443270 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.4432358929999998 podStartE2EDuration="2.443235893s" podCreationTimestamp="2026-02-19 09:05:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:05:30.442460045 +0000 UTC m=+1232.430471527" watchObservedRunningTime="2026-02-19 09:05:30.443235893 +0000 UTC m=+1232.431247365"
Feb 19 09:05:30 crc kubenswrapper[4788]: I0219 09:05:30.724017 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3812032-b03d-4773-bb95-7ccbb252b7de" path="/var/lib/kubelet/pods/e3812032-b03d-4773-bb95-7ccbb252b7de/volumes"
Feb 19 09:05:32 crc kubenswrapper[4788]: I0219 09:05:32.782259 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 19 09:05:34 crc kubenswrapper[4788]: I0219 09:05:34.165973 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 09:05:34 crc kubenswrapper[4788]: I0219 09:05:34.166108 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 19 09:05:35 crc kubenswrapper[4788]: I0219 09:05:35.723467 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 09:05:35 crc kubenswrapper[4788]: I0219 09:05:35.723847 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 19 09:05:36 crc kubenswrapper[4788]: I0219 09:05:36.751586 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2530b11c-c0ce-4ab3-9a0b-70060eb85184" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 09:05:36 crc kubenswrapper[4788]: I0219 09:05:36.751524 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2530b11c-c0ce-4ab3-9a0b-70060eb85184" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 09:05:37 crc kubenswrapper[4788]: I0219 09:05:37.782809 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 19 09:05:37 crc kubenswrapper[4788]: I0219 09:05:37.809201 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 19 09:05:38 crc kubenswrapper[4788]: I0219 09:05:38.535859 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 19 09:05:39 crc kubenswrapper[4788]: I0219 09:05:39.165584 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 09:05:39 crc kubenswrapper[4788]: I0219 09:05:39.165650 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 19 09:05:40 crc kubenswrapper[4788]: I0219 09:05:40.179405 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fb5bf2a2-d945-4fab-a232-ee95c75d94d0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 09:05:40 crc kubenswrapper[4788]: I0219 09:05:40.179448 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fb5bf2a2-d945-4fab-a232-ee95c75d94d0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 19 09:05:45 crc kubenswrapper[4788]: I0219 09:05:45.730715 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 09:05:45 crc kubenswrapper[4788]: I0219 09:05:45.732587 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 09:05:45 crc kubenswrapper[4788]: I0219 09:05:45.733096 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 19 09:05:45 crc kubenswrapper[4788]: I0219 09:05:45.739331 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 09:05:46 crc kubenswrapper[4788]: I0219 09:05:46.023330 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 19 09:05:46 crc kubenswrapper[4788]: I0219 09:05:46.577993 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 19 09:05:46 crc kubenswrapper[4788]: I0219 09:05:46.588068 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 19 09:05:49 crc kubenswrapper[4788]: I0219 09:05:49.181019 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 09:05:49 crc kubenswrapper[4788]: I0219 09:05:49.182574 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 19 09:05:49 crc kubenswrapper[4788]: I0219 09:05:49.186982 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 09:05:49 crc kubenswrapper[4788]: I0219 09:05:49.191003 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 19 09:05:57 crc kubenswrapper[4788]: I0219 09:05:57.220073 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 09:05:58 crc kubenswrapper[4788]: I0219 09:05:58.029963 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 09:06:01 crc kubenswrapper[4788]: I0219 09:06:01.749287 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="ad57631d-1772-49f0-ae6b-f16ee556e9c4" containerName="rabbitmq" containerID="cri-o://4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52" gracePeriod=604796
Feb 19 09:06:02 crc kubenswrapper[4788]: I0219 09:06:02.858822 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bb0abe11-b278-4a3a-aeda-3e08a603924b" containerName="rabbitmq" containerID="cri-o://93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea" gracePeriod=604796
Feb 19 09:06:06 crc kubenswrapper[4788]: I0219 09:06:06.596225 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="ad57631d-1772-49f0-ae6b-f16ee556e9c4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused"
Feb 19 09:06:06 crc kubenswrapper[4788]: I0219 09:06:06.850300 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="bb0abe11-b278-4a3a-aeda-3e08a603924b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.96:5671: connect: connection refused"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.485530 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.591464 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.591601 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-plugins\") pod \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.591646 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad57631d-1772-49f0-ae6b-f16ee556e9c4-pod-info\") pod \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.591684 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-confd\") pod \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.591730 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-config-data\") pod \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.591767 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad57631d-1772-49f0-ae6b-f16ee556e9c4-erlang-cookie-secret\") pod \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.591798 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-erlang-cookie\") pod \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.591822 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8lgs\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-kube-api-access-c8lgs\") pod \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.591861 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-tls\") pod \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.591888 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-server-conf\") pod \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.591930 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-plugins-conf\") pod \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\" (UID: \"ad57631d-1772-49f0-ae6b-f16ee556e9c4\") "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.592924 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ad57631d-1772-49f0-ae6b-f16ee556e9c4" (UID: "ad57631d-1772-49f0-ae6b-f16ee556e9c4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.594215 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ad57631d-1772-49f0-ae6b-f16ee556e9c4" (UID: "ad57631d-1772-49f0-ae6b-f16ee556e9c4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.594445 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ad57631d-1772-49f0-ae6b-f16ee556e9c4" (UID: "ad57631d-1772-49f0-ae6b-f16ee556e9c4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.598184 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ad57631d-1772-49f0-ae6b-f16ee556e9c4" (UID: "ad57631d-1772-49f0-ae6b-f16ee556e9c4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.600948 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-kube-api-access-c8lgs" (OuterVolumeSpecName: "kube-api-access-c8lgs") pod "ad57631d-1772-49f0-ae6b-f16ee556e9c4" (UID: "ad57631d-1772-49f0-ae6b-f16ee556e9c4"). InnerVolumeSpecName "kube-api-access-c8lgs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.601035 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "ad57631d-1772-49f0-ae6b-f16ee556e9c4" (UID: "ad57631d-1772-49f0-ae6b-f16ee556e9c4"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.602692 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad57631d-1772-49f0-ae6b-f16ee556e9c4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ad57631d-1772-49f0-ae6b-f16ee556e9c4" (UID: "ad57631d-1772-49f0-ae6b-f16ee556e9c4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.605330 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ad57631d-1772-49f0-ae6b-f16ee556e9c4-pod-info" (OuterVolumeSpecName: "pod-info") pod "ad57631d-1772-49f0-ae6b-f16ee556e9c4" (UID: "ad57631d-1772-49f0-ae6b-f16ee556e9c4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.646764 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-config-data" (OuterVolumeSpecName: "config-data") pod "ad57631d-1772-49f0-ae6b-f16ee556e9c4" (UID: "ad57631d-1772-49f0-ae6b-f16ee556e9c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.653219 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-server-conf" (OuterVolumeSpecName: "server-conf") pod "ad57631d-1772-49f0-ae6b-f16ee556e9c4" (UID: "ad57631d-1772-49f0-ae6b-f16ee556e9c4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.694327 4788 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.694362 4788 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-server-conf\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.694376 4788 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.694410 4788 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.694486 4788 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.694497 4788 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad57631d-1772-49f0-ae6b-f16ee556e9c4-pod-info\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.694505 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad57631d-1772-49f0-ae6b-f16ee556e9c4-config-data\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.694512 4788 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad57631d-1772-49f0-ae6b-f16ee556e9c4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.694534 4788 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.694549 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8lgs\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-kube-api-access-c8lgs\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.731726 4788 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.755085 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ad57631d-1772-49f0-ae6b-f16ee556e9c4" (UID: "ad57631d-1772-49f0-ae6b-f16ee556e9c4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.790433 4788 generic.go:334] "Generic (PLEG): container finished" podID="ad57631d-1772-49f0-ae6b-f16ee556e9c4" containerID="4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52" exitCode=0
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.790487 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad57631d-1772-49f0-ae6b-f16ee556e9c4","Type":"ContainerDied","Data":"4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52"}
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.790528 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad57631d-1772-49f0-ae6b-f16ee556e9c4","Type":"ContainerDied","Data":"4d41512668aac09f7d3b537846306904627738615997a0bb5873d6e1afd45ed9"}
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.790555 4788 scope.go:117] "RemoveContainer" containerID="4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.791011 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.797959 4788 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad57631d-1772-49f0-ae6b-f16ee556e9c4-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.797990 4788 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.836677 4788 scope.go:117] "RemoveContainer" containerID="deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.837861 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.856658 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.868729 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 09:06:08 crc kubenswrapper[4788]: E0219 09:06:08.869138 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad57631d-1772-49f0-ae6b-f16ee556e9c4" containerName="rabbitmq"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.869154 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad57631d-1772-49f0-ae6b-f16ee556e9c4" containerName="rabbitmq"
Feb 19 09:06:08 crc kubenswrapper[4788]: E0219 09:06:08.869174 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad57631d-1772-49f0-ae6b-f16ee556e9c4" containerName="setup-container"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.869180 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad57631d-1772-49f0-ae6b-f16ee556e9c4" containerName="setup-container"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.869359 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad57631d-1772-49f0-ae6b-f16ee556e9c4" containerName="rabbitmq"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.870332 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.871678 4788 scope.go:117] "RemoveContainer" containerID="4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.874281 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.874566 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.874793 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.875021 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vlc28"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.875198 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 19 09:06:08 crc kubenswrapper[4788]: E0219 09:06:08.875391 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52\": container with ID starting with 4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52 not found: ID does not exist" containerID="4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.875440 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52"} err="failed to get container status \"4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52\": rpc error: code = NotFound desc = could not find container \"4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52\": container with ID starting with 4019abff3f9af6de8e3e5c9c1d41f3f0cbb7ee4f36b10555194a1ca325c3df52 not found: ID does not exist"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.875464 4788 scope.go:117] "RemoveContainer" containerID="deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.875579 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.875662 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 19 09:06:08 crc kubenswrapper[4788]: E0219 09:06:08.877469 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36\": container with ID starting with deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36 not found: ID does not exist" containerID="deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.877502 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36"} err="failed to get container status \"deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36\": rpc error: code = NotFound desc = could not find container \"deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36\": container with ID starting with deddce3e5fbca7e5e4f3590a860e9be3c48b26f24fd8b65ebae966cfb57f2f36 not found: ID does not exist"
Feb 19 09:06:08 crc kubenswrapper[4788]: I0219 09:06:08.887078 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.002956 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/77af71c0-581a-4e58-9429-bb14901b1a1d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0"
Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.003013 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0"
Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.003045 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/77af71c0-581a-4e58-9429-bb14901b1a1d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0"
Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.003071 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7mh5\" (UniqueName: \"kubernetes.io/projected/77af71c0-581a-4e58-9429-bb14901b1a1d-kube-api-access-m7mh5\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0"
Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.003174 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName:
\"kubernetes.io/downward-api/77af71c0-581a-4e58-9429-bb14901b1a1d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.003230 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.003279 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.003309 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/77af71c0-581a-4e58-9429-bb14901b1a1d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.003331 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.003366 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77af71c0-581a-4e58-9429-bb14901b1a1d-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.003421 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.109190 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.109379 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/77af71c0-581a-4e58-9429-bb14901b1a1d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.109413 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.109469 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/77af71c0-581a-4e58-9429-bb14901b1a1d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc 
kubenswrapper[4788]: I0219 09:06:09.109509 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7mh5\" (UniqueName: \"kubernetes.io/projected/77af71c0-581a-4e58-9429-bb14901b1a1d-kube-api-access-m7mh5\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.109605 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/77af71c0-581a-4e58-9429-bb14901b1a1d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.109656 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.109696 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.109720 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/77af71c0-581a-4e58-9429-bb14901b1a1d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.109739 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.109780 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77af71c0-581a-4e58-9429-bb14901b1a1d-config-data\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.109769 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.110198 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.110361 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/77af71c0-581a-4e58-9429-bb14901b1a1d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.110407 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") device mount path \"/mnt/openstack/pv08\"" 
pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.110589 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77af71c0-581a-4e58-9429-bb14901b1a1d-config-data\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.110993 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/77af71c0-581a-4e58-9429-bb14901b1a1d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.114305 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.116043 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/77af71c0-581a-4e58-9429-bb14901b1a1d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.116061 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/77af71c0-581a-4e58-9429-bb14901b1a1d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.116384 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/77af71c0-581a-4e58-9429-bb14901b1a1d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.128541 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7mh5\" (UniqueName: \"kubernetes.io/projected/77af71c0-581a-4e58-9429-bb14901b1a1d-kube-api-access-m7mh5\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.145607 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"77af71c0-581a-4e58-9429-bb14901b1a1d\") " pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.201398 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.608802 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.724984 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-server-conf\") pod \"bb0abe11-b278-4a3a-aeda-3e08a603924b\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.725893 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-plugins\") pod \"bb0abe11-b278-4a3a-aeda-3e08a603924b\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.726022 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb0abe11-b278-4a3a-aeda-3e08a603924b-erlang-cookie-secret\") pod \"bb0abe11-b278-4a3a-aeda-3e08a603924b\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.726138 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-confd\") pod \"bb0abe11-b278-4a3a-aeda-3e08a603924b\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.726233 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-plugins-conf\") pod \"bb0abe11-b278-4a3a-aeda-3e08a603924b\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.726346 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6bl7r\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-kube-api-access-6bl7r\") pod \"bb0abe11-b278-4a3a-aeda-3e08a603924b\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.726430 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"bb0abe11-b278-4a3a-aeda-3e08a603924b\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.726574 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-erlang-cookie\") pod \"bb0abe11-b278-4a3a-aeda-3e08a603924b\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.726742 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-config-data\") pod \"bb0abe11-b278-4a3a-aeda-3e08a603924b\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.726863 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-tls\") pod \"bb0abe11-b278-4a3a-aeda-3e08a603924b\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.726989 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb0abe11-b278-4a3a-aeda-3e08a603924b-pod-info\") pod \"bb0abe11-b278-4a3a-aeda-3e08a603924b\" (UID: \"bb0abe11-b278-4a3a-aeda-3e08a603924b\") " Feb 19 09:06:09 crc kubenswrapper[4788]: 
I0219 09:06:09.728798 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bb0abe11-b278-4a3a-aeda-3e08a603924b" (UID: "bb0abe11-b278-4a3a-aeda-3e08a603924b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.728802 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bb0abe11-b278-4a3a-aeda-3e08a603924b" (UID: "bb0abe11-b278-4a3a-aeda-3e08a603924b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.733659 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0abe11-b278-4a3a-aeda-3e08a603924b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bb0abe11-b278-4a3a-aeda-3e08a603924b" (UID: "bb0abe11-b278-4a3a-aeda-3e08a603924b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.736699 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "bb0abe11-b278-4a3a-aeda-3e08a603924b" (UID: "bb0abe11-b278-4a3a-aeda-3e08a603924b"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.737397 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-kube-api-access-6bl7r" (OuterVolumeSpecName: "kube-api-access-6bl7r") pod "bb0abe11-b278-4a3a-aeda-3e08a603924b" (UID: "bb0abe11-b278-4a3a-aeda-3e08a603924b"). InnerVolumeSpecName "kube-api-access-6bl7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.738842 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bb0abe11-b278-4a3a-aeda-3e08a603924b" (UID: "bb0abe11-b278-4a3a-aeda-3e08a603924b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.740217 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bb0abe11-b278-4a3a-aeda-3e08a603924b" (UID: "bb0abe11-b278-4a3a-aeda-3e08a603924b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.754407 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bb0abe11-b278-4a3a-aeda-3e08a603924b-pod-info" (OuterVolumeSpecName: "pod-info") pod "bb0abe11-b278-4a3a-aeda-3e08a603924b" (UID: "bb0abe11-b278-4a3a-aeda-3e08a603924b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.781111 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-config-data" (OuterVolumeSpecName: "config-data") pod "bb0abe11-b278-4a3a-aeda-3e08a603924b" (UID: "bb0abe11-b278-4a3a-aeda-3e08a603924b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.810640 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.823818 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-server-conf" (OuterVolumeSpecName: "server-conf") pod "bb0abe11-b278-4a3a-aeda-3e08a603924b" (UID: "bb0abe11-b278-4a3a-aeda-3e08a603924b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.824869 4788 generic.go:334] "Generic (PLEG): container finished" podID="bb0abe11-b278-4a3a-aeda-3e08a603924b" containerID="93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea" exitCode=0 Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.825070 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.825147 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bb0abe11-b278-4a3a-aeda-3e08a603924b","Type":"ContainerDied","Data":"93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea"} Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.825184 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bb0abe11-b278-4a3a-aeda-3e08a603924b","Type":"ContainerDied","Data":"6c0b6e48330d9271c831ac45b9ffe57b73b71e8d2ea969cf1a926d8aa4075997"} Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.825204 4788 scope.go:117] "RemoveContainer" containerID="93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.829597 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bl7r\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-kube-api-access-6bl7r\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.829633 4788 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.830156 4788 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.830196 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 
09:06:09.830208 4788 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.830218 4788 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bb0abe11-b278-4a3a-aeda-3e08a603924b-pod-info\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.830228 4788 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-server-conf\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.830239 4788 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.830264 4788 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bb0abe11-b278-4a3a-aeda-3e08a603924b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.830275 4788 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bb0abe11-b278-4a3a-aeda-3e08a603924b-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.857541 4788 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.860818 4788 scope.go:117] "RemoveContainer" containerID="60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5" Feb 19 09:06:09 crc 
kubenswrapper[4788]: I0219 09:06:09.883552 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bb0abe11-b278-4a3a-aeda-3e08a603924b" (UID: "bb0abe11-b278-4a3a-aeda-3e08a603924b"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.897828 4788 scope.go:117] "RemoveContainer" containerID="93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea" Feb 19 09:06:09 crc kubenswrapper[4788]: E0219 09:06:09.898262 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea\": container with ID starting with 93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea not found: ID does not exist" containerID="93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.898288 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea"} err="failed to get container status \"93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea\": rpc error: code = NotFound desc = could not find container \"93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea\": container with ID starting with 93b48ea1e098a9845923fb506b1b8c105d22f6d2ca761f64a6ef9e4fa86157ea not found: ID does not exist" Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.898311 4788 scope.go:117] "RemoveContainer" containerID="60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5" Feb 19 09:06:09 crc kubenswrapper[4788]: E0219 09:06:09.898475 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5\": container with ID starting with 60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5 not found: ID does not exist" containerID="60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5"
Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.898489 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5"} err="failed to get container status \"60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5\": rpc error: code = NotFound desc = could not find container \"60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5\": container with ID starting with 60fda4c512a83dbf7ddc7048784297b75972741506cf905455ea7286b6869cf5 not found: ID does not exist"
Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.931539 4788 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bb0abe11-b278-4a3a-aeda-3e08a603924b-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:09 crc kubenswrapper[4788]: I0219 09:06:09.931575 4788 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.165189 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.178137 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.200320 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 09:06:10 crc kubenswrapper[4788]: E0219 09:06:10.200699 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0abe11-b278-4a3a-aeda-3e08a603924b" containerName="rabbitmq"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.200717 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0abe11-b278-4a3a-aeda-3e08a603924b" containerName="rabbitmq"
Feb 19 09:06:10 crc kubenswrapper[4788]: E0219 09:06:10.200743 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0abe11-b278-4a3a-aeda-3e08a603924b" containerName="setup-container"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.200750 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0abe11-b278-4a3a-aeda-3e08a603924b" containerName="setup-container"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.200926 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0abe11-b278-4a3a-aeda-3e08a603924b" containerName="rabbitmq"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.201860 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.204067 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2bt5k"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.206937 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.207055 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.207074 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.207819 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.210421 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.213594 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.224830 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.338505 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnxtx\" (UniqueName: \"kubernetes.io/projected/01ace23c-c0e0-4390-85fc-1b50f8d72a66-kube-api-access-tnxtx\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.338598 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ace23c-c0e0-4390-85fc-1b50f8d72a66-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.338639 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.338733 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.338801 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.338824 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ace23c-c0e0-4390-85fc-1b50f8d72a66-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.338847 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ace23c-c0e0-4390-85fc-1b50f8d72a66-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.338869 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.338922 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ace23c-c0e0-4390-85fc-1b50f8d72a66-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.338956 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ace23c-c0e0-4390-85fc-1b50f8d72a66-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.338988 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440122 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440171 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ace23c-c0e0-4390-85fc-1b50f8d72a66-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440192 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ace23c-c0e0-4390-85fc-1b50f8d72a66-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440210 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440239 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ace23c-c0e0-4390-85fc-1b50f8d72a66-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440286 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ace23c-c0e0-4390-85fc-1b50f8d72a66-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440325 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440373 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnxtx\" (UniqueName: \"kubernetes.io/projected/01ace23c-c0e0-4390-85fc-1b50f8d72a66-kube-api-access-tnxtx\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440418 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ace23c-c0e0-4390-85fc-1b50f8d72a66-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440456 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440533 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440761 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.440883 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.441282 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.442226 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ace23c-c0e0-4390-85fc-1b50f8d72a66-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.442363 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ace23c-c0e0-4390-85fc-1b50f8d72a66-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.443300 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ace23c-c0e0-4390-85fc-1b50f8d72a66-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.445201 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.445265 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ace23c-c0e0-4390-85fc-1b50f8d72a66-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.446068 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ace23c-c0e0-4390-85fc-1b50f8d72a66-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.446150 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ace23c-c0e0-4390-85fc-1b50f8d72a66-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.460327 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnxtx\" (UniqueName: \"kubernetes.io/projected/01ace23c-c0e0-4390-85fc-1b50f8d72a66-kube-api-access-tnxtx\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.474798 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"01ace23c-c0e0-4390-85fc-1b50f8d72a66\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.569527 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.752201 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad57631d-1772-49f0-ae6b-f16ee556e9c4" path="/var/lib/kubelet/pods/ad57631d-1772-49f0-ae6b-f16ee556e9c4/volumes"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.753476 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb0abe11-b278-4a3a-aeda-3e08a603924b" path="/var/lib/kubelet/pods/bb0abe11-b278-4a3a-aeda-3e08a603924b/volumes"
Feb 19 09:06:10 crc kubenswrapper[4788]: I0219 09:06:10.840373 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"77af71c0-581a-4e58-9429-bb14901b1a1d","Type":"ContainerStarted","Data":"c5a5d04a942902c5840983c57a6c161e2a554e12bfee08f56fb8d637d6aba5f6"}
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.078974 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.552130 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-nsdqg"]
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.553772 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.555434 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.576131 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-nsdqg"]
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.664166 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x76hn\" (UniqueName: \"kubernetes.io/projected/a9503da9-4341-4fa9-9570-d03ae14040ae-kube-api-access-x76hn\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.664295 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.664326 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.664380 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-config\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.664424 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.664528 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.664559 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.768203 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.766428 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.768401 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.769513 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.769653 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-config\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.769708 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.769894 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.769959 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.770007 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x76hn\" (UniqueName: \"kubernetes.io/projected/a9503da9-4341-4fa9-9570-d03ae14040ae-kube-api-access-x76hn\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.771626 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-config\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.772924 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.775679 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.776463 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.792336 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x76hn\" (UniqueName: \"kubernetes.io/projected/a9503da9-4341-4fa9-9570-d03ae14040ae-kube-api-access-x76hn\") pod \"dnsmasq-dns-7d84b4d45c-nsdqg\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.851398 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"77af71c0-581a-4e58-9429-bb14901b1a1d","Type":"ContainerStarted","Data":"dbf6e3fadf7e39b8b236d39b4588c25ad91aa90123a170fcfc4f889c321c8070"}
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.853183 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"01ace23c-c0e0-4390-85fc-1b50f8d72a66","Type":"ContainerStarted","Data":"7d2483d58c0ab68863e807553d27b004e96af5c3d9ea0a99c4e244d6782fe1a9"}
Feb 19 09:06:11 crc kubenswrapper[4788]: I0219 09:06:11.901112 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:12 crc kubenswrapper[4788]: I0219 09:06:12.366500 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-nsdqg"]
Feb 19 09:06:12 crc kubenswrapper[4788]: I0219 09:06:12.864982 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"01ace23c-c0e0-4390-85fc-1b50f8d72a66","Type":"ContainerStarted","Data":"502aad58cec5be0a8ea1ef1bc8880d48df58931c0b99293906c3454b4df6be43"}
Feb 19 09:06:12 crc kubenswrapper[4788]: I0219 09:06:12.867627 4788 generic.go:334] "Generic (PLEG): container finished" podID="a9503da9-4341-4fa9-9570-d03ae14040ae" containerID="4de205ede351638840f3c17b0e02592651eb8b0c216efb3c7733c221ae536880" exitCode=0
Feb 19 09:06:12 crc kubenswrapper[4788]: I0219 09:06:12.867738 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg" event={"ID":"a9503da9-4341-4fa9-9570-d03ae14040ae","Type":"ContainerDied","Data":"4de205ede351638840f3c17b0e02592651eb8b0c216efb3c7733c221ae536880"}
Feb 19 09:06:12 crc kubenswrapper[4788]: I0219 09:06:12.867874 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg" event={"ID":"a9503da9-4341-4fa9-9570-d03ae14040ae","Type":"ContainerStarted","Data":"6f3d0a4600b475cf6f9efdfe6fef80a90f42ca2324be50f6951dd3e94aaa3f5d"}
Feb 19 09:06:13 crc kubenswrapper[4788]: I0219 09:06:13.891611 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg" event={"ID":"a9503da9-4341-4fa9-9570-d03ae14040ae","Type":"ContainerStarted","Data":"658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d"}
Feb 19 09:06:13 crc kubenswrapper[4788]: I0219 09:06:13.924469 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg" podStartSLOduration=2.924442883 podStartE2EDuration="2.924442883s" podCreationTimestamp="2026-02-19 09:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:06:13.917975186 +0000 UTC m=+1275.905986698" watchObservedRunningTime="2026-02-19 09:06:13.924442883 +0000 UTC m=+1275.912454385"
Feb 19 09:06:14 crc kubenswrapper[4788]: I0219 09:06:14.899985 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:21 crc kubenswrapper[4788]: I0219 09:06:21.903486 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg"
Feb 19 09:06:21 crc kubenswrapper[4788]: I0219 09:06:21.984459 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4"]
Feb 19 09:06:21 crc kubenswrapper[4788]: I0219 09:06:21.986054 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" podUID="bb86749e-3ca7-473a-88ee-26e930c57552" containerName="dnsmasq-dns" containerID="cri-o://a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94" gracePeriod=10
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.139034 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-67csm"]
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.139372 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.139423 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.140510 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.197094 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-67csm"]
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.198218 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.198346 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.198399 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-config\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.198471 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.198525 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.198581 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.198610 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6dwm\" (UniqueName: \"kubernetes.io/projected/4830cf68-1ea9-4b7f-899b-5a5935bc2230-kube-api-access-l6dwm\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.302440 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.302500 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.302520 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6dwm\" (UniqueName: \"kubernetes.io/projected/4830cf68-1ea9-4b7f-899b-5a5935bc2230-kube-api-access-l6dwm\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.302569 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.302620 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.302658 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-config\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.302709 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.303640 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.304279 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.304295 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.304821 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm"
Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.305018 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-dns-swift-storage-0\") pod
\"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm" Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.307317 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4830cf68-1ea9-4b7f-899b-5a5935bc2230-config\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm" Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.337272 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6dwm\" (UniqueName: \"kubernetes.io/projected/4830cf68-1ea9-4b7f-899b-5a5935bc2230-kube-api-access-l6dwm\") pod \"dnsmasq-dns-6f6df4f56c-67csm\" (UID: \"4830cf68-1ea9-4b7f-899b-5a5935bc2230\") " pod="openstack/dnsmasq-dns-6f6df4f56c-67csm" Feb 19 09:06:22 crc kubenswrapper[4788]: I0219 09:06:22.480704 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-67csm" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.669170 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.711973 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-config\") pod \"bb86749e-3ca7-473a-88ee-26e930c57552\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.712019 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-nb\") pod \"bb86749e-3ca7-473a-88ee-26e930c57552\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.712063 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-swift-storage-0\") pod \"bb86749e-3ca7-473a-88ee-26e930c57552\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.712132 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lx8f\" (UniqueName: \"kubernetes.io/projected/bb86749e-3ca7-473a-88ee-26e930c57552-kube-api-access-8lx8f\") pod \"bb86749e-3ca7-473a-88ee-26e930c57552\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.712421 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-sb\") pod \"bb86749e-3ca7-473a-88ee-26e930c57552\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.712491 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-svc\") pod \"bb86749e-3ca7-473a-88ee-26e930c57552\" (UID: \"bb86749e-3ca7-473a-88ee-26e930c57552\") " Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.734550 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb86749e-3ca7-473a-88ee-26e930c57552-kube-api-access-8lx8f" (OuterVolumeSpecName: "kube-api-access-8lx8f") pod "bb86749e-3ca7-473a-88ee-26e930c57552" (UID: "bb86749e-3ca7-473a-88ee-26e930c57552"). InnerVolumeSpecName "kube-api-access-8lx8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.769436 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-config" (OuterVolumeSpecName: "config") pod "bb86749e-3ca7-473a-88ee-26e930c57552" (UID: "bb86749e-3ca7-473a-88ee-26e930c57552"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.781510 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb86749e-3ca7-473a-88ee-26e930c57552" (UID: "bb86749e-3ca7-473a-88ee-26e930c57552"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.782890 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb86749e-3ca7-473a-88ee-26e930c57552" (UID: "bb86749e-3ca7-473a-88ee-26e930c57552"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.792953 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb86749e-3ca7-473a-88ee-26e930c57552" (UID: "bb86749e-3ca7-473a-88ee-26e930c57552"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.793374 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb86749e-3ca7-473a-88ee-26e930c57552" (UID: "bb86749e-3ca7-473a-88ee-26e930c57552"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.817052 4788 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.817083 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lx8f\" (UniqueName: \"kubernetes.io/projected/bb86749e-3ca7-473a-88ee-26e930c57552-kube-api-access-8lx8f\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.817096 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.817106 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-dns-svc\") on node \"crc\" DevicePath 
\"\"" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.817115 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.817123 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb86749e-3ca7-473a-88ee-26e930c57552-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.995302 4788 generic.go:334] "Generic (PLEG): container finished" podID="bb86749e-3ca7-473a-88ee-26e930c57552" containerID="a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94" exitCode=0 Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.995350 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" event={"ID":"bb86749e-3ca7-473a-88ee-26e930c57552","Type":"ContainerDied","Data":"a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94"} Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.995384 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" event={"ID":"bb86749e-3ca7-473a-88ee-26e930c57552","Type":"ContainerDied","Data":"9b16c3ff2d226d58be4c39a6c9946d9e6ba26a34c1606b24b590146b41a2aa93"} Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.995399 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:22.995410 4788 scope.go:117] "RemoveContainer" containerID="a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:23.025475 4788 scope.go:117] "RemoveContainer" containerID="670f3e1927fd871c0a41b0c44b9c32311e4c4e9f6f10b7eaaba810f99a2b7d12" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:23.030582 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4"] Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:23.040162 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4"] Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:23.051891 4788 scope.go:117] "RemoveContainer" containerID="a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94" Feb 19 09:06:23 crc kubenswrapper[4788]: E0219 09:06:23.052426 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94\": container with ID starting with a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94 not found: ID does not exist" containerID="a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:23.052472 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94"} err="failed to get container status \"a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94\": rpc error: code = NotFound desc = could not find container \"a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94\": container with ID starting with a46fa43fd1f8bc8c5169e9fd6082d0dc2801826eead66fa72355763880f76c94 not found: ID does not exist" Feb 19 
09:06:23 crc kubenswrapper[4788]: I0219 09:06:23.052502 4788 scope.go:117] "RemoveContainer" containerID="670f3e1927fd871c0a41b0c44b9c32311e4c4e9f6f10b7eaaba810f99a2b7d12" Feb 19 09:06:23 crc kubenswrapper[4788]: E0219 09:06:23.052786 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670f3e1927fd871c0a41b0c44b9c32311e4c4e9f6f10b7eaaba810f99a2b7d12\": container with ID starting with 670f3e1927fd871c0a41b0c44b9c32311e4c4e9f6f10b7eaaba810f99a2b7d12 not found: ID does not exist" containerID="670f3e1927fd871c0a41b0c44b9c32311e4c4e9f6f10b7eaaba810f99a2b7d12" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:23.052816 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670f3e1927fd871c0a41b0c44b9c32311e4c4e9f6f10b7eaaba810f99a2b7d12"} err="failed to get container status \"670f3e1927fd871c0a41b0c44b9c32311e4c4e9f6f10b7eaaba810f99a2b7d12\": rpc error: code = NotFound desc = could not find container \"670f3e1927fd871c0a41b0c44b9c32311e4c4e9f6f10b7eaaba810f99a2b7d12\": container with ID starting with 670f3e1927fd871c0a41b0c44b9c32311e4c4e9f6f10b7eaaba810f99a2b7d12 not found: ID does not exist" Feb 19 09:06:23 crc kubenswrapper[4788]: I0219 09:06:23.407570 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-67csm"] Feb 19 09:06:24 crc kubenswrapper[4788]: I0219 09:06:24.003988 4788 generic.go:334] "Generic (PLEG): container finished" podID="4830cf68-1ea9-4b7f-899b-5a5935bc2230" containerID="df931df4c6e2e07eeb3bf9423da0bae60fb95857624599db6e77c0babe831354" exitCode=0 Feb 19 09:06:24 crc kubenswrapper[4788]: I0219 09:06:24.004438 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-67csm" event={"ID":"4830cf68-1ea9-4b7f-899b-5a5935bc2230","Type":"ContainerDied","Data":"df931df4c6e2e07eeb3bf9423da0bae60fb95857624599db6e77c0babe831354"} Feb 19 09:06:24 crc 
kubenswrapper[4788]: I0219 09:06:24.004469 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-67csm" event={"ID":"4830cf68-1ea9-4b7f-899b-5a5935bc2230","Type":"ContainerStarted","Data":"21e4b3509f64a6b84c5ac55fd292525566f22b3e71fe608b7284c280c5b7df34"} Feb 19 09:06:24 crc kubenswrapper[4788]: I0219 09:06:24.730617 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb86749e-3ca7-473a-88ee-26e930c57552" path="/var/lib/kubelet/pods/bb86749e-3ca7-473a-88ee-26e930c57552/volumes" Feb 19 09:06:25 crc kubenswrapper[4788]: I0219 09:06:25.017085 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-67csm" event={"ID":"4830cf68-1ea9-4b7f-899b-5a5935bc2230","Type":"ContainerStarted","Data":"3a3d93ca08c49a64e8a5362aa006e24d8d2cc802cbeb720ba963709941c56d6f"} Feb 19 09:06:25 crc kubenswrapper[4788]: I0219 09:06:25.017386 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-67csm" Feb 19 09:06:25 crc kubenswrapper[4788]: I0219 09:06:25.045043 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-67csm" podStartSLOduration=3.045019994 podStartE2EDuration="3.045019994s" podCreationTimestamp="2026-02-19 09:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:06:25.035607687 +0000 UTC m=+1287.023619159" watchObservedRunningTime="2026-02-19 09:06:25.045019994 +0000 UTC m=+1287.033031476" Feb 19 09:06:27 crc kubenswrapper[4788]: I0219 09:06:27.571419 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7bbf7cf9-qhlb4" podUID="bb86749e-3ca7-473a-88ee-26e930c57552" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.210:5353: i/o timeout" Feb 19 09:06:32 crc kubenswrapper[4788]: I0219 09:06:32.572201 4788 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-67csm" Feb 19 09:06:32 crc kubenswrapper[4788]: I0219 09:06:32.650239 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-nsdqg"] Feb 19 09:06:32 crc kubenswrapper[4788]: I0219 09:06:32.650483 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg" podUID="a9503da9-4341-4fa9-9570-d03ae14040ae" containerName="dnsmasq-dns" containerID="cri-o://658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d" gracePeriod=10 Feb 19 09:06:33 crc kubenswrapper[4788]: I0219 09:06:33.642676 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.033618 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-config\") pod \"a9503da9-4341-4fa9-9570-d03ae14040ae\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.033694 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-swift-storage-0\") pod \"a9503da9-4341-4fa9-9570-d03ae14040ae\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.033726 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-svc\") pod \"a9503da9-4341-4fa9-9570-d03ae14040ae\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.033743 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-sb\") pod \"a9503da9-4341-4fa9-9570-d03ae14040ae\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.033779 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-nb\") pod \"a9503da9-4341-4fa9-9570-d03ae14040ae\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.033827 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-openstack-edpm-ipam\") pod \"a9503da9-4341-4fa9-9570-d03ae14040ae\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.033915 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x76hn\" (UniqueName: \"kubernetes.io/projected/a9503da9-4341-4fa9-9570-d03ae14040ae-kube-api-access-x76hn\") pod \"a9503da9-4341-4fa9-9570-d03ae14040ae\" (UID: \"a9503da9-4341-4fa9-9570-d03ae14040ae\") " Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.053366 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9503da9-4341-4fa9-9570-d03ae14040ae-kube-api-access-x76hn" (OuterVolumeSpecName: "kube-api-access-x76hn") pod "a9503da9-4341-4fa9-9570-d03ae14040ae" (UID: "a9503da9-4341-4fa9-9570-d03ae14040ae"). InnerVolumeSpecName "kube-api-access-x76hn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.090318 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a9503da9-4341-4fa9-9570-d03ae14040ae" (UID: "a9503da9-4341-4fa9-9570-d03ae14040ae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.099013 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a9503da9-4341-4fa9-9570-d03ae14040ae" (UID: "a9503da9-4341-4fa9-9570-d03ae14040ae"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.102221 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a9503da9-4341-4fa9-9570-d03ae14040ae" (UID: "a9503da9-4341-4fa9-9570-d03ae14040ae"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.103962 4788 generic.go:334] "Generic (PLEG): container finished" podID="a9503da9-4341-4fa9-9570-d03ae14040ae" containerID="658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d" exitCode=0 Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.103999 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg" event={"ID":"a9503da9-4341-4fa9-9570-d03ae14040ae","Type":"ContainerDied","Data":"658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d"} Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.104024 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg" event={"ID":"a9503da9-4341-4fa9-9570-d03ae14040ae","Type":"ContainerDied","Data":"6f3d0a4600b475cf6f9efdfe6fef80a90f42ca2324be50f6951dd3e94aaa3f5d"} Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.104040 4788 scope.go:117] "RemoveContainer" containerID="658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.104060 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-nsdqg" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.109818 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9503da9-4341-4fa9-9570-d03ae14040ae" (UID: "a9503da9-4341-4fa9-9570-d03ae14040ae"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.110894 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a9503da9-4341-4fa9-9570-d03ae14040ae" (UID: "a9503da9-4341-4fa9-9570-d03ae14040ae"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.123693 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-config" (OuterVolumeSpecName: "config") pod "a9503da9-4341-4fa9-9570-d03ae14040ae" (UID: "a9503da9-4341-4fa9-9570-d03ae14040ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.136228 4788 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.136289 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.136299 4788 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.136307 4788 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" 
Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.136315 4788 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.136325 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x76hn\" (UniqueName: \"kubernetes.io/projected/a9503da9-4341-4fa9-9570-d03ae14040ae-kube-api-access-x76hn\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.136334 4788 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9503da9-4341-4fa9-9570-d03ae14040ae-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.204506 4788 scope.go:117] "RemoveContainer" containerID="4de205ede351638840f3c17b0e02592651eb8b0c216efb3c7733c221ae536880" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.234112 4788 scope.go:117] "RemoveContainer" containerID="658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d" Feb 19 09:06:34 crc kubenswrapper[4788]: E0219 09:06:34.234588 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d\": container with ID starting with 658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d not found: ID does not exist" containerID="658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.234633 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d"} err="failed to get container status \"658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d\": rpc error: code = NotFound 
desc = could not find container \"658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d\": container with ID starting with 658fd5d0b0582bd66322b5dd664b921371ed6f55c9c13f0fd3b86864dd790a4d not found: ID does not exist" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.234662 4788 scope.go:117] "RemoveContainer" containerID="4de205ede351638840f3c17b0e02592651eb8b0c216efb3c7733c221ae536880" Feb 19 09:06:34 crc kubenswrapper[4788]: E0219 09:06:34.234999 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4de205ede351638840f3c17b0e02592651eb8b0c216efb3c7733c221ae536880\": container with ID starting with 4de205ede351638840f3c17b0e02592651eb8b0c216efb3c7733c221ae536880 not found: ID does not exist" containerID="4de205ede351638840f3c17b0e02592651eb8b0c216efb3c7733c221ae536880" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.235023 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de205ede351638840f3c17b0e02592651eb8b0c216efb3c7733c221ae536880"} err="failed to get container status \"4de205ede351638840f3c17b0e02592651eb8b0c216efb3c7733c221ae536880\": rpc error: code = NotFound desc = could not find container \"4de205ede351638840f3c17b0e02592651eb8b0c216efb3c7733c221ae536880\": container with ID starting with 4de205ede351638840f3c17b0e02592651eb8b0c216efb3c7733c221ae536880 not found: ID does not exist" Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.446081 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-nsdqg"] Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.455989 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-nsdqg"] Feb 19 09:06:34 crc kubenswrapper[4788]: I0219 09:06:34.728400 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9503da9-4341-4fa9-9570-d03ae14040ae" 
path="/var/lib/kubelet/pods/a9503da9-4341-4fa9-9570-d03ae14040ae/volumes" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.253138 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn"] Feb 19 09:06:41 crc kubenswrapper[4788]: E0219 09:06:41.254215 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb86749e-3ca7-473a-88ee-26e930c57552" containerName="dnsmasq-dns" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.254233 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb86749e-3ca7-473a-88ee-26e930c57552" containerName="dnsmasq-dns" Feb 19 09:06:41 crc kubenswrapper[4788]: E0219 09:06:41.254246 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9503da9-4341-4fa9-9570-d03ae14040ae" containerName="dnsmasq-dns" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.254253 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9503da9-4341-4fa9-9570-d03ae14040ae" containerName="dnsmasq-dns" Feb 19 09:06:41 crc kubenswrapper[4788]: E0219 09:06:41.254294 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9503da9-4341-4fa9-9570-d03ae14040ae" containerName="init" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.254303 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9503da9-4341-4fa9-9570-d03ae14040ae" containerName="init" Feb 19 09:06:41 crc kubenswrapper[4788]: E0219 09:06:41.254340 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb86749e-3ca7-473a-88ee-26e930c57552" containerName="init" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.254348 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb86749e-3ca7-473a-88ee-26e930c57552" containerName="init" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.254582 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb86749e-3ca7-473a-88ee-26e930c57552" containerName="dnsmasq-dns" Feb 19 
09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.254632 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9503da9-4341-4fa9-9570-d03ae14040ae" containerName="dnsmasq-dns" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.255451 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.257913 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.258149 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.258183 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.259429 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.277075 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn"] Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.426056 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56t6d\" (UniqueName: \"kubernetes.io/projected/4a6886e3-bd0f-4551-ac66-421e052315f1-kube-api-access-56t6d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.426153 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.426216 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.426394 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.528045 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.528139 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: 
\"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.528211 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56t6d\" (UniqueName: \"kubernetes.io/projected/4a6886e3-bd0f-4551-ac66-421e052315f1-kube-api-access-56t6d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.528263 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.534152 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.534379 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.535533 4788 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.544691 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56t6d\" (UniqueName: \"kubernetes.io/projected/4a6886e3-bd0f-4551-ac66-421e052315f1-kube-api-access-56t6d\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:41 crc kubenswrapper[4788]: I0219 09:06:41.612281 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:06:42 crc kubenswrapper[4788]: W0219 09:06:42.257067 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a6886e3_bd0f_4551_ac66_421e052315f1.slice/crio-9166a45fdb76d943080fe1da4b42e4b2987769ec847134e1b75d8e59942995cf WatchSource:0}: Error finding container 9166a45fdb76d943080fe1da4b42e4b2987769ec847134e1b75d8e59942995cf: Status 404 returned error can't find the container with id 9166a45fdb76d943080fe1da4b42e4b2987769ec847134e1b75d8e59942995cf Feb 19 09:06:42 crc kubenswrapper[4788]: I0219 09:06:42.260032 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn"] Feb 19 09:06:43 crc kubenswrapper[4788]: I0219 09:06:43.189943 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" 
event={"ID":"4a6886e3-bd0f-4551-ac66-421e052315f1","Type":"ContainerStarted","Data":"9166a45fdb76d943080fe1da4b42e4b2987769ec847134e1b75d8e59942995cf"} Feb 19 09:06:43 crc kubenswrapper[4788]: E0219 09:06:43.396099 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77af71c0_581a_4e58_9429_bb14901b1a1d.slice/crio-conmon-dbf6e3fadf7e39b8b236d39b4588c25ad91aa90123a170fcfc4f889c321c8070.scope\": RecentStats: unable to find data in memory cache]" Feb 19 09:06:44 crc kubenswrapper[4788]: I0219 09:06:44.200151 4788 generic.go:334] "Generic (PLEG): container finished" podID="77af71c0-581a-4e58-9429-bb14901b1a1d" containerID="dbf6e3fadf7e39b8b236d39b4588c25ad91aa90123a170fcfc4f889c321c8070" exitCode=0 Feb 19 09:06:44 crc kubenswrapper[4788]: I0219 09:06:44.200174 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"77af71c0-581a-4e58-9429-bb14901b1a1d","Type":"ContainerDied","Data":"dbf6e3fadf7e39b8b236d39b4588c25ad91aa90123a170fcfc4f889c321c8070"} Feb 19 09:06:45 crc kubenswrapper[4788]: I0219 09:06:45.211927 4788 generic.go:334] "Generic (PLEG): container finished" podID="01ace23c-c0e0-4390-85fc-1b50f8d72a66" containerID="502aad58cec5be0a8ea1ef1bc8880d48df58931c0b99293906c3454b4df6be43" exitCode=0 Feb 19 09:06:45 crc kubenswrapper[4788]: I0219 09:06:45.211984 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"01ace23c-c0e0-4390-85fc-1b50f8d72a66","Type":"ContainerDied","Data":"502aad58cec5be0a8ea1ef1bc8880d48df58931c0b99293906c3454b4df6be43"} Feb 19 09:06:46 crc kubenswrapper[4788]: I0219 09:06:46.226945 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"77af71c0-581a-4e58-9429-bb14901b1a1d","Type":"ContainerStarted","Data":"327df8c0708a25558705d50e6f8f6740ae1514b1755dd05377910401a951b894"} Feb 
19 09:06:46 crc kubenswrapper[4788]: I0219 09:06:46.227491 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 19 09:06:46 crc kubenswrapper[4788]: I0219 09:06:46.232910 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"01ace23c-c0e0-4390-85fc-1b50f8d72a66","Type":"ContainerStarted","Data":"fd24a8ee1434386c46bcbd23ac0a73d71cfa1e83f8eefc1bb3cfece8a3e24343"} Feb 19 09:06:46 crc kubenswrapper[4788]: I0219 09:06:46.233735 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:06:46 crc kubenswrapper[4788]: I0219 09:06:46.271166 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.271135214 podStartE2EDuration="38.271135214s" podCreationTimestamp="2026-02-19 09:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:06:46.254492601 +0000 UTC m=+1308.242504143" watchObservedRunningTime="2026-02-19 09:06:46.271135214 +0000 UTC m=+1308.259146686" Feb 19 09:06:46 crc kubenswrapper[4788]: I0219 09:06:46.284912 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.284891447 podStartE2EDuration="36.284891447s" podCreationTimestamp="2026-02-19 09:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:06:46.283898363 +0000 UTC m=+1308.271909885" watchObservedRunningTime="2026-02-19 09:06:46.284891447 +0000 UTC m=+1308.272902919" Feb 19 09:06:52 crc kubenswrapper[4788]: I0219 09:06:52.139701 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:06:52 crc kubenswrapper[4788]: I0219 09:06:52.140313 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:06:59 crc kubenswrapper[4788]: I0219 09:06:59.216716 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="77af71c0-581a-4e58-9429-bb14901b1a1d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.217:5671: connect: connection refused" Feb 19 09:07:00 crc kubenswrapper[4788]: I0219 09:07:00.571614 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="01ace23c-c0e0-4390-85fc-1b50f8d72a66" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.218:5671: connect: connection refused" Feb 19 09:07:01 crc kubenswrapper[4788]: E0219 09:07:01.295811 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Feb 19 09:07:01 crc kubenswrapper[4788]: E0219 09:07:01.296058 4788 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 19 09:07:01 crc kubenswrapper[4788]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Feb 19 09:07:01 crc 
kubenswrapper[4788]: - hosts: all Feb 19 09:07:01 crc kubenswrapper[4788]: strategy: linear Feb 19 09:07:01 crc kubenswrapper[4788]: tasks: Feb 19 09:07:01 crc kubenswrapper[4788]: - name: Enable podified-repos Feb 19 09:07:01 crc kubenswrapper[4788]: become: true Feb 19 09:07:01 crc kubenswrapper[4788]: ansible.builtin.shell: | Feb 19 09:07:01 crc kubenswrapper[4788]: set -euxo pipefail Feb 19 09:07:01 crc kubenswrapper[4788]: pushd /var/tmp Feb 19 09:07:01 crc kubenswrapper[4788]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Feb 19 09:07:01 crc kubenswrapper[4788]: pushd repo-setup-main Feb 19 09:07:01 crc kubenswrapper[4788]: python3 -m venv ./venv Feb 19 09:07:01 crc kubenswrapper[4788]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Feb 19 09:07:01 crc kubenswrapper[4788]: ./venv/bin/repo-setup current-podified -b antelope Feb 19 09:07:01 crc kubenswrapper[4788]: popd Feb 19 09:07:01 crc kubenswrapper[4788]: rm -rf repo-setup-main Feb 19 09:07:01 crc kubenswrapper[4788]: Feb 19 09:07:01 crc kubenswrapper[4788]: Feb 19 09:07:01 crc kubenswrapper[4788]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Feb 19 09:07:01 crc kubenswrapper[4788]: edpm_override_hosts: openstack-edpm-ipam Feb 19 09:07:01 crc kubenswrapper[4788]: edpm_service_type: repo-setup Feb 19 09:07:01 crc kubenswrapper[4788]: Feb 19 09:07:01 crc kubenswrapper[4788]: Feb 19 09:07:01 crc kubenswrapper[4788]: 
,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key-openstack-edpm-ipam,ReadOnly:false,MountPath:/runner/env/ssh_key/ssh_key_openstack-edpm-ipam,SubPath:ssh_key_openstack-edpm-ipam,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-56t6d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn_openstack(4a6886e3-bd0f-4551-ac66-421e052315f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Feb 19 09:07:01 crc kubenswrapper[4788]: > logger="UnhandledError" Feb 19 09:07:01 crc kubenswrapper[4788]: E0219 09:07:01.297165 4788 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" podUID="4a6886e3-bd0f-4551-ac66-421e052315f1" Feb 19 09:07:01 crc kubenswrapper[4788]: E0219 09:07:01.397086 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" podUID="4a6886e3-bd0f-4551-ac66-421e052315f1" Feb 19 09:07:09 crc kubenswrapper[4788]: I0219 09:07:09.202175 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="77af71c0-581a-4e58-9429-bb14901b1a1d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.217:5671: connect: connection refused" Feb 19 09:07:10 crc kubenswrapper[4788]: I0219 09:07:10.570645 4788 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="01ace23c-c0e0-4390-85fc-1b50f8d72a66" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.218:5671: connect: connection refused" Feb 19 09:07:17 crc kubenswrapper[4788]: I0219 09:07:17.837798 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:07:18 crc kubenswrapper[4788]: I0219 09:07:18.574719 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" event={"ID":"4a6886e3-bd0f-4551-ac66-421e052315f1","Type":"ContainerStarted","Data":"0921660478f5d582e11ae375914e2ac32f9006a48e8684a0bf54c6eea6e95cd4"} Feb 19 09:07:18 crc kubenswrapper[4788]: I0219 09:07:18.602406 
4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" podStartSLOduration=2.026622277 podStartE2EDuration="37.602385671s" podCreationTimestamp="2026-02-19 09:06:41 +0000 UTC" firstStartedPulling="2026-02-19 09:06:42.258702742 +0000 UTC m=+1304.246714214" lastFinishedPulling="2026-02-19 09:07:17.834466096 +0000 UTC m=+1339.822477608" observedRunningTime="2026-02-19 09:07:18.597457032 +0000 UTC m=+1340.585468524" watchObservedRunningTime="2026-02-19 09:07:18.602385671 +0000 UTC m=+1340.590397143" Feb 19 09:07:19 crc kubenswrapper[4788]: I0219 09:07:19.203460 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 19 09:07:19 crc kubenswrapper[4788]: I0219 09:07:19.397685 4788 scope.go:117] "RemoveContainer" containerID="f79f7f215457e3f62436a7880fc0f4e2d278d59ba21cabcab9023b3bc6dcf828" Feb 19 09:07:19 crc kubenswrapper[4788]: I0219 09:07:19.441152 4788 scope.go:117] "RemoveContainer" containerID="ba56eec83a04e92d2d18ed54ea13d11542ec80e3175278394d07c2eead7cfc6d" Feb 19 09:07:19 crc kubenswrapper[4788]: I0219 09:07:19.519545 4788 scope.go:117] "RemoveContainer" containerID="8c7107ad9f27bebbeef0be275cdba9100ba79e714caa256853032f1e9accd1be" Feb 19 09:07:19 crc kubenswrapper[4788]: I0219 09:07:19.663280 4788 scope.go:117] "RemoveContainer" containerID="a4cf7ab5f1bc3368868977237f9ba9aa59efe583804eaad15f3258d392e9ac5d" Feb 19 09:07:19 crc kubenswrapper[4788]: I0219 09:07:19.803804 4788 scope.go:117] "RemoveContainer" containerID="bc9dad1c8b96583b095f43327a26f5b52928020b47328c06387a4db217612f2d" Feb 19 09:07:20 crc kubenswrapper[4788]: I0219 09:07:20.571405 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 19 09:07:22 crc kubenswrapper[4788]: I0219 09:07:22.139141 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:07:22 crc kubenswrapper[4788]: I0219 09:07:22.139545 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:07:22 crc kubenswrapper[4788]: I0219 09:07:22.139601 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 09:07:22 crc kubenswrapper[4788]: I0219 09:07:22.140346 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"defbc637313174f365a0e7e0457f9fbe5da4bafbab9b16543dceddb4ed84fa1d"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 09:07:22 crc kubenswrapper[4788]: I0219 09:07:22.140420 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://defbc637313174f365a0e7e0457f9fbe5da4bafbab9b16543dceddb4ed84fa1d" gracePeriod=600 Feb 19 09:07:22 crc kubenswrapper[4788]: I0219 09:07:22.648362 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="defbc637313174f365a0e7e0457f9fbe5da4bafbab9b16543dceddb4ed84fa1d" exitCode=0 Feb 19 09:07:22 crc kubenswrapper[4788]: I0219 09:07:22.648455 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"defbc637313174f365a0e7e0457f9fbe5da4bafbab9b16543dceddb4ed84fa1d"} Feb 19 09:07:22 crc kubenswrapper[4788]: I0219 09:07:22.648648 4788 scope.go:117] "RemoveContainer" containerID="31cfa590dbe60cb7189f587f667407f74d6387f19ad0205b2e674711ceebc406" Feb 19 09:07:23 crc kubenswrapper[4788]: I0219 09:07:23.663681 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81"} Feb 19 09:07:30 crc kubenswrapper[4788]: I0219 09:07:30.784635 4788 generic.go:334] "Generic (PLEG): container finished" podID="4a6886e3-bd0f-4551-ac66-421e052315f1" containerID="0921660478f5d582e11ae375914e2ac32f9006a48e8684a0bf54c6eea6e95cd4" exitCode=0 Feb 19 09:07:30 crc kubenswrapper[4788]: I0219 09:07:30.784718 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" event={"ID":"4a6886e3-bd0f-4551-ac66-421e052315f1","Type":"ContainerDied","Data":"0921660478f5d582e11ae375914e2ac32f9006a48e8684a0bf54c6eea6e95cd4"} Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.277929 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.369441 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-ssh-key-openstack-edpm-ipam\") pod \"4a6886e3-bd0f-4551-ac66-421e052315f1\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.369530 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-repo-setup-combined-ca-bundle\") pod \"4a6886e3-bd0f-4551-ac66-421e052315f1\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.370832 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56t6d\" (UniqueName: \"kubernetes.io/projected/4a6886e3-bd0f-4551-ac66-421e052315f1-kube-api-access-56t6d\") pod \"4a6886e3-bd0f-4551-ac66-421e052315f1\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.371024 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-inventory\") pod \"4a6886e3-bd0f-4551-ac66-421e052315f1\" (UID: \"4a6886e3-bd0f-4551-ac66-421e052315f1\") " Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.376593 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6886e3-bd0f-4551-ac66-421e052315f1-kube-api-access-56t6d" (OuterVolumeSpecName: "kube-api-access-56t6d") pod "4a6886e3-bd0f-4551-ac66-421e052315f1" (UID: "4a6886e3-bd0f-4551-ac66-421e052315f1"). InnerVolumeSpecName "kube-api-access-56t6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.377069 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4a6886e3-bd0f-4551-ac66-421e052315f1" (UID: "4a6886e3-bd0f-4551-ac66-421e052315f1"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.407202 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-inventory" (OuterVolumeSpecName: "inventory") pod "4a6886e3-bd0f-4551-ac66-421e052315f1" (UID: "4a6886e3-bd0f-4551-ac66-421e052315f1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.407872 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4a6886e3-bd0f-4551-ac66-421e052315f1" (UID: "4a6886e3-bd0f-4551-ac66-421e052315f1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.474675 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.474739 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.474761 4788 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6886e3-bd0f-4551-ac66-421e052315f1-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.474780 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56t6d\" (UniqueName: \"kubernetes.io/projected/4a6886e3-bd0f-4551-ac66-421e052315f1-kube-api-access-56t6d\") on node \"crc\" DevicePath \"\"" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.809430 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" event={"ID":"4a6886e3-bd0f-4551-ac66-421e052315f1","Type":"ContainerDied","Data":"9166a45fdb76d943080fe1da4b42e4b2987769ec847134e1b75d8e59942995cf"} Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.809505 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9166a45fdb76d943080fe1da4b42e4b2987769ec847134e1b75d8e59942995cf" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.809596 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.913463 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l"] Feb 19 09:07:32 crc kubenswrapper[4788]: E0219 09:07:32.914094 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6886e3-bd0f-4551-ac66-421e052315f1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.914185 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6886e3-bd0f-4551-ac66-421e052315f1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.914492 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6886e3-bd0f-4551-ac66-421e052315f1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.915332 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.921139 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.921489 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.922138 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.922773 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:07:32 crc kubenswrapper[4788]: I0219 09:07:32.931100 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l"] Feb 19 09:07:33 crc kubenswrapper[4788]: I0219 09:07:33.087579 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gxh4\" (UniqueName: \"kubernetes.io/projected/e9459589-7001-4a8f-ac0c-6dae0ce143bf-kube-api-access-8gxh4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sqp6l\" (UID: \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:33 crc kubenswrapper[4788]: I0219 09:07:33.087701 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sqp6l\" (UID: \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:33 crc kubenswrapper[4788]: I0219 09:07:33.087757 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sqp6l\" (UID: \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:33 crc kubenswrapper[4788]: I0219 09:07:33.190113 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gxh4\" (UniqueName: \"kubernetes.io/projected/e9459589-7001-4a8f-ac0c-6dae0ce143bf-kube-api-access-8gxh4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sqp6l\" (UID: \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:33 crc kubenswrapper[4788]: I0219 09:07:33.190326 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sqp6l\" (UID: \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:33 crc kubenswrapper[4788]: I0219 09:07:33.190391 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sqp6l\" (UID: \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:33 crc kubenswrapper[4788]: I0219 09:07:33.198573 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-sqp6l\" (UID: \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:33 crc kubenswrapper[4788]: I0219 09:07:33.198678 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sqp6l\" (UID: \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:33 crc kubenswrapper[4788]: I0219 09:07:33.214819 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gxh4\" (UniqueName: \"kubernetes.io/projected/e9459589-7001-4a8f-ac0c-6dae0ce143bf-kube-api-access-8gxh4\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-sqp6l\" (UID: \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:33 crc kubenswrapper[4788]: I0219 09:07:33.265650 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:33 crc kubenswrapper[4788]: I0219 09:07:33.938429 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l"] Feb 19 09:07:33 crc kubenswrapper[4788]: W0219 09:07:33.939598 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9459589_7001_4a8f_ac0c_6dae0ce143bf.slice/crio-4c06e833b6a0c0c50b479ad3c8269a55ce4ad25c2e69efd4583fbb6eb1e20f0e WatchSource:0}: Error finding container 4c06e833b6a0c0c50b479ad3c8269a55ce4ad25c2e69efd4583fbb6eb1e20f0e: Status 404 returned error can't find the container with id 4c06e833b6a0c0c50b479ad3c8269a55ce4ad25c2e69efd4583fbb6eb1e20f0e Feb 19 09:07:34 crc kubenswrapper[4788]: I0219 09:07:34.833371 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" event={"ID":"e9459589-7001-4a8f-ac0c-6dae0ce143bf","Type":"ContainerStarted","Data":"4c06e833b6a0c0c50b479ad3c8269a55ce4ad25c2e69efd4583fbb6eb1e20f0e"} Feb 19 09:07:36 crc kubenswrapper[4788]: I0219 09:07:36.854092 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" event={"ID":"e9459589-7001-4a8f-ac0c-6dae0ce143bf","Type":"ContainerStarted","Data":"63e02b95ed3547f5f1211b693124af7a04b222946501353a8ee8f574472cf0b4"} Feb 19 09:07:36 crc kubenswrapper[4788]: I0219 09:07:36.880784 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" podStartSLOduration=3.279381005 podStartE2EDuration="4.880761423s" podCreationTimestamp="2026-02-19 09:07:32 +0000 UTC" firstStartedPulling="2026-02-19 09:07:33.942624079 +0000 UTC m=+1355.930635551" lastFinishedPulling="2026-02-19 09:07:35.544004497 +0000 UTC m=+1357.532015969" observedRunningTime="2026-02-19 
09:07:36.879461292 +0000 UTC m=+1358.867472774" watchObservedRunningTime="2026-02-19 09:07:36.880761423 +0000 UTC m=+1358.868772925" Feb 19 09:07:39 crc kubenswrapper[4788]: I0219 09:07:39.888410 4788 generic.go:334] "Generic (PLEG): container finished" podID="e9459589-7001-4a8f-ac0c-6dae0ce143bf" containerID="63e02b95ed3547f5f1211b693124af7a04b222946501353a8ee8f574472cf0b4" exitCode=0 Feb 19 09:07:39 crc kubenswrapper[4788]: I0219 09:07:39.888493 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" event={"ID":"e9459589-7001-4a8f-ac0c-6dae0ce143bf","Type":"ContainerDied","Data":"63e02b95ed3547f5f1211b693124af7a04b222946501353a8ee8f574472cf0b4"} Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.380913 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.453206 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-inventory\") pod \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\" (UID: \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.453298 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-ssh-key-openstack-edpm-ipam\") pod \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\" (UID: \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.453445 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gxh4\" (UniqueName: \"kubernetes.io/projected/e9459589-7001-4a8f-ac0c-6dae0ce143bf-kube-api-access-8gxh4\") pod \"e9459589-7001-4a8f-ac0c-6dae0ce143bf\" (UID: 
\"e9459589-7001-4a8f-ac0c-6dae0ce143bf\") " Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.461688 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9459589-7001-4a8f-ac0c-6dae0ce143bf-kube-api-access-8gxh4" (OuterVolumeSpecName: "kube-api-access-8gxh4") pod "e9459589-7001-4a8f-ac0c-6dae0ce143bf" (UID: "e9459589-7001-4a8f-ac0c-6dae0ce143bf"). InnerVolumeSpecName "kube-api-access-8gxh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.494064 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9459589-7001-4a8f-ac0c-6dae0ce143bf" (UID: "e9459589-7001-4a8f-ac0c-6dae0ce143bf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.497968 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-inventory" (OuterVolumeSpecName: "inventory") pod "e9459589-7001-4a8f-ac0c-6dae0ce143bf" (UID: "e9459589-7001-4a8f-ac0c-6dae0ce143bf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.555770 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gxh4\" (UniqueName: \"kubernetes.io/projected/e9459589-7001-4a8f-ac0c-6dae0ce143bf-kube-api-access-8gxh4\") on node \"crc\" DevicePath \"\"" Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.555820 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.555837 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9459589-7001-4a8f-ac0c-6dae0ce143bf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.917486 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" event={"ID":"e9459589-7001-4a8f-ac0c-6dae0ce143bf","Type":"ContainerDied","Data":"4c06e833b6a0c0c50b479ad3c8269a55ce4ad25c2e69efd4583fbb6eb1e20f0e"} Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.917571 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c06e833b6a0c0c50b479ad3c8269a55ce4ad25c2e69efd4583fbb6eb1e20f0e" Feb 19 09:07:41 crc kubenswrapper[4788]: I0219 09:07:41.917704 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-sqp6l" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.011028 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49"] Feb 19 09:07:42 crc kubenswrapper[4788]: E0219 09:07:42.011463 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9459589-7001-4a8f-ac0c-6dae0ce143bf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.011481 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9459589-7001-4a8f-ac0c-6dae0ce143bf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.011663 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9459589-7001-4a8f-ac0c-6dae0ce143bf" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.012275 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.015436 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.016894 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.016910 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.017164 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.027430 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49"] Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.066280 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df59g\" (UniqueName: \"kubernetes.io/projected/ac977ac7-d7dd-4af4-a079-dbcadde95e32-kube-api-access-df59g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.066370 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 
09:07:42.066404 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.066487 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.168024 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.168138 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df59g\" (UniqueName: \"kubernetes.io/projected/ac977ac7-d7dd-4af4-a079-dbcadde95e32-kube-api-access-df59g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.168192 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.168223 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.175110 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.175332 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.176503 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.187698 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df59g\" (UniqueName: \"kubernetes.io/projected/ac977ac7-d7dd-4af4-a079-dbcadde95e32-kube-api-access-df59g\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.348963 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:07:42 crc kubenswrapper[4788]: I0219 09:07:42.932070 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49"] Feb 19 09:07:43 crc kubenswrapper[4788]: I0219 09:07:43.950684 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" event={"ID":"ac977ac7-d7dd-4af4-a079-dbcadde95e32","Type":"ContainerStarted","Data":"b5e7480c86729a327a0075ac758075c573892c8830851887876da04f95cff348"} Feb 19 09:07:45 crc kubenswrapper[4788]: I0219 09:07:45.979887 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" event={"ID":"ac977ac7-d7dd-4af4-a079-dbcadde95e32","Type":"ContainerStarted","Data":"1e71949eea5cd3d5a5d8025f53fecdb194a272fdfc9bdf5620928e946d9adaf2"} Feb 19 09:07:46 crc kubenswrapper[4788]: I0219 09:07:46.012175 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" podStartSLOduration=3.085189617 podStartE2EDuration="5.012150838s" podCreationTimestamp="2026-02-19 09:07:41 +0000 UTC" firstStartedPulling="2026-02-19 09:07:42.925373825 +0000 UTC m=+1364.913385297" 
lastFinishedPulling="2026-02-19 09:07:44.852335016 +0000 UTC m=+1366.840346518" observedRunningTime="2026-02-19 09:07:45.999560409 +0000 UTC m=+1367.987571941" watchObservedRunningTime="2026-02-19 09:07:46.012150838 +0000 UTC m=+1368.000162310" Feb 19 09:08:20 crc kubenswrapper[4788]: I0219 09:08:20.191665 4788 scope.go:117] "RemoveContainer" containerID="d4da65a43c21e2c1ef2ae88da20296925ef9afa26feac2315d52eecff708f338" Feb 19 09:09:52 crc kubenswrapper[4788]: I0219 09:09:52.138961 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:09:52 crc kubenswrapper[4788]: I0219 09:09:52.139638 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:10:22 crc kubenswrapper[4788]: I0219 09:10:22.139795 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:10:22 crc kubenswrapper[4788]: I0219 09:10:22.140460 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:10:28 crc kubenswrapper[4788]: I0219 09:10:28.909693 4788 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6jfcf"] Feb 19 09:10:28 crc kubenswrapper[4788]: I0219 09:10:28.912366 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:28 crc kubenswrapper[4788]: I0219 09:10:28.922022 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6jfcf"] Feb 19 09:10:29 crc kubenswrapper[4788]: I0219 09:10:29.016175 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-utilities\") pod \"community-operators-6jfcf\" (UID: \"562a8331-2771-4300-9d37-a0db3cac89a6\") " pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:29 crc kubenswrapper[4788]: I0219 09:10:29.016303 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhdvv\" (UniqueName: \"kubernetes.io/projected/562a8331-2771-4300-9d37-a0db3cac89a6-kube-api-access-dhdvv\") pod \"community-operators-6jfcf\" (UID: \"562a8331-2771-4300-9d37-a0db3cac89a6\") " pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:29 crc kubenswrapper[4788]: I0219 09:10:29.016437 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-catalog-content\") pod \"community-operators-6jfcf\" (UID: \"562a8331-2771-4300-9d37-a0db3cac89a6\") " pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:29 crc kubenswrapper[4788]: I0219 09:10:29.118054 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhdvv\" (UniqueName: \"kubernetes.io/projected/562a8331-2771-4300-9d37-a0db3cac89a6-kube-api-access-dhdvv\") pod 
\"community-operators-6jfcf\" (UID: \"562a8331-2771-4300-9d37-a0db3cac89a6\") " pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:29 crc kubenswrapper[4788]: I0219 09:10:29.118631 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-catalog-content\") pod \"community-operators-6jfcf\" (UID: \"562a8331-2771-4300-9d37-a0db3cac89a6\") " pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:29 crc kubenswrapper[4788]: I0219 09:10:29.119150 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-catalog-content\") pod \"community-operators-6jfcf\" (UID: \"562a8331-2771-4300-9d37-a0db3cac89a6\") " pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:29 crc kubenswrapper[4788]: I0219 09:10:29.119296 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-utilities\") pod \"community-operators-6jfcf\" (UID: \"562a8331-2771-4300-9d37-a0db3cac89a6\") " pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:29 crc kubenswrapper[4788]: I0219 09:10:29.119587 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-utilities\") pod \"community-operators-6jfcf\" (UID: \"562a8331-2771-4300-9d37-a0db3cac89a6\") " pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:29 crc kubenswrapper[4788]: I0219 09:10:29.141478 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhdvv\" (UniqueName: \"kubernetes.io/projected/562a8331-2771-4300-9d37-a0db3cac89a6-kube-api-access-dhdvv\") pod \"community-operators-6jfcf\" (UID: 
\"562a8331-2771-4300-9d37-a0db3cac89a6\") " pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:29 crc kubenswrapper[4788]: I0219 09:10:29.233950 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:29 crc kubenswrapper[4788]: I0219 09:10:29.739821 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6jfcf"] Feb 19 09:10:30 crc kubenswrapper[4788]: I0219 09:10:30.071227 4788 generic.go:334] "Generic (PLEG): container finished" podID="562a8331-2771-4300-9d37-a0db3cac89a6" containerID="146227fa0b031a26677c3ec8ed931973b25e2f7f906366139f0abbb5d59edb82" exitCode=0 Feb 19 09:10:30 crc kubenswrapper[4788]: I0219 09:10:30.071341 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jfcf" event={"ID":"562a8331-2771-4300-9d37-a0db3cac89a6","Type":"ContainerDied","Data":"146227fa0b031a26677c3ec8ed931973b25e2f7f906366139f0abbb5d59edb82"} Feb 19 09:10:30 crc kubenswrapper[4788]: I0219 09:10:30.071690 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jfcf" event={"ID":"562a8331-2771-4300-9d37-a0db3cac89a6","Type":"ContainerStarted","Data":"f6e09c8910ba3062db06f962e02f0e74f69ef3df486dde45db6e16353e0401b3"} Feb 19 09:10:30 crc kubenswrapper[4788]: I0219 09:10:30.074589 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:10:32 crc kubenswrapper[4788]: I0219 09:10:32.097567 4788 generic.go:334] "Generic (PLEG): container finished" podID="ac977ac7-d7dd-4af4-a079-dbcadde95e32" containerID="1e71949eea5cd3d5a5d8025f53fecdb194a272fdfc9bdf5620928e946d9adaf2" exitCode=0 Feb 19 09:10:32 crc kubenswrapper[4788]: I0219 09:10:32.097807 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" 
event={"ID":"ac977ac7-d7dd-4af4-a079-dbcadde95e32","Type":"ContainerDied","Data":"1e71949eea5cd3d5a5d8025f53fecdb194a272fdfc9bdf5620928e946d9adaf2"} Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.108886 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jfcf" event={"ID":"562a8331-2771-4300-9d37-a0db3cac89a6","Type":"ContainerStarted","Data":"61b0e99de294027d2e712770f9be71573fc2cfae8b7f185e2467f10f9a362813"} Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.612401 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.722654 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-inventory\") pod \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.722726 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-bootstrap-combined-ca-bundle\") pod \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.722863 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-df59g\" (UniqueName: \"kubernetes.io/projected/ac977ac7-d7dd-4af4-a079-dbcadde95e32-kube-api-access-df59g\") pod \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.723011 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-ssh-key-openstack-edpm-ipam\") pod \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\" (UID: \"ac977ac7-d7dd-4af4-a079-dbcadde95e32\") " Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.731647 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac977ac7-d7dd-4af4-a079-dbcadde95e32-kube-api-access-df59g" (OuterVolumeSpecName: "kube-api-access-df59g") pod "ac977ac7-d7dd-4af4-a079-dbcadde95e32" (UID: "ac977ac7-d7dd-4af4-a079-dbcadde95e32"). InnerVolumeSpecName "kube-api-access-df59g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.731640 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ac977ac7-d7dd-4af4-a079-dbcadde95e32" (UID: "ac977ac7-d7dd-4af4-a079-dbcadde95e32"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.764558 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-inventory" (OuterVolumeSpecName: "inventory") pod "ac977ac7-d7dd-4af4-a079-dbcadde95e32" (UID: "ac977ac7-d7dd-4af4-a079-dbcadde95e32"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.775605 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ac977ac7-d7dd-4af4-a079-dbcadde95e32" (UID: "ac977ac7-d7dd-4af4-a079-dbcadde95e32"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.825625 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.825866 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.825931 4788 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac977ac7-d7dd-4af4-a079-dbcadde95e32-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:10:33 crc kubenswrapper[4788]: I0219 09:10:33.825985 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-df59g\" (UniqueName: \"kubernetes.io/projected/ac977ac7-d7dd-4af4-a079-dbcadde95e32-kube-api-access-df59g\") on node \"crc\" DevicePath \"\"" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.124282 4788 generic.go:334] "Generic (PLEG): container finished" podID="562a8331-2771-4300-9d37-a0db3cac89a6" containerID="61b0e99de294027d2e712770f9be71573fc2cfae8b7f185e2467f10f9a362813" exitCode=0 Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.124349 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jfcf" event={"ID":"562a8331-2771-4300-9d37-a0db3cac89a6","Type":"ContainerDied","Data":"61b0e99de294027d2e712770f9be71573fc2cfae8b7f185e2467f10f9a362813"} Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.126781 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" 
event={"ID":"ac977ac7-d7dd-4af4-a079-dbcadde95e32","Type":"ContainerDied","Data":"b5e7480c86729a327a0075ac758075c573892c8830851887876da04f95cff348"} Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.126822 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5e7480c86729a327a0075ac758075c573892c8830851887876da04f95cff348" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.126888 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.243900 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk"] Feb 19 09:10:34 crc kubenswrapper[4788]: E0219 09:10:34.245314 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac977ac7-d7dd-4af4-a079-dbcadde95e32" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.245341 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac977ac7-d7dd-4af4-a079-dbcadde95e32" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.246078 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac977ac7-d7dd-4af4-a079-dbcadde95e32" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.247283 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.249712 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.250840 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.250875 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.251322 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.283295 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk"] Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.339601 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.339826 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 
09:10:34.339905 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qcrz\" (UniqueName: \"kubernetes.io/projected/952e8fd1-9634-4af1-8f56-62068214b66c-kube-api-access-6qcrz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.442337 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qcrz\" (UniqueName: \"kubernetes.io/projected/952e8fd1-9634-4af1-8f56-62068214b66c-kube-api-access-6qcrz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.442484 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.442733 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.451503 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.457512 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.463144 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qcrz\" (UniqueName: \"kubernetes.io/projected/952e8fd1-9634-4af1-8f56-62068214b66c-kube-api-access-6qcrz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:10:34 crc kubenswrapper[4788]: I0219 09:10:34.578509 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:10:35 crc kubenswrapper[4788]: I0219 09:10:35.277713 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk"] Feb 19 09:10:35 crc kubenswrapper[4788]: W0219 09:10:35.281978 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod952e8fd1_9634_4af1_8f56_62068214b66c.slice/crio-c34a2f8c9b6c24362ea924d33a45332b6732ad645201a7e699d7612ef9e3939f WatchSource:0}: Error finding container c34a2f8c9b6c24362ea924d33a45332b6732ad645201a7e699d7612ef9e3939f: Status 404 returned error can't find the container with id c34a2f8c9b6c24362ea924d33a45332b6732ad645201a7e699d7612ef9e3939f Feb 19 09:10:36 crc kubenswrapper[4788]: I0219 09:10:36.160741 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" event={"ID":"952e8fd1-9634-4af1-8f56-62068214b66c","Type":"ContainerStarted","Data":"c34a2f8c9b6c24362ea924d33a45332b6732ad645201a7e699d7612ef9e3939f"} Feb 19 09:10:41 crc kubenswrapper[4788]: I0219 09:10:41.217852 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jfcf" event={"ID":"562a8331-2771-4300-9d37-a0db3cac89a6","Type":"ContainerStarted","Data":"dad05abfac3ee7bc149ea460d1bcaeb0b7b9f1664655512277c36fd62ec2b499"} Feb 19 09:10:41 crc kubenswrapper[4788]: I0219 09:10:41.249904 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6jfcf" podStartSLOduration=2.615967228 podStartE2EDuration="13.249884726s" podCreationTimestamp="2026-02-19 09:10:28 +0000 UTC" firstStartedPulling="2026-02-19 09:10:30.074263766 +0000 UTC m=+1532.062275238" lastFinishedPulling="2026-02-19 09:10:40.708181264 +0000 UTC m=+1542.696192736" 
observedRunningTime="2026-02-19 09:10:41.245904598 +0000 UTC m=+1543.233916080" watchObservedRunningTime="2026-02-19 09:10:41.249884726 +0000 UTC m=+1543.237896198" Feb 19 09:10:42 crc kubenswrapper[4788]: I0219 09:10:42.230752 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" event={"ID":"952e8fd1-9634-4af1-8f56-62068214b66c","Type":"ContainerStarted","Data":"628c5e93adf4cbf402ad8b15f4caade4fe8cb7eb93884155a62483a09908213a"} Feb 19 09:10:42 crc kubenswrapper[4788]: I0219 09:10:42.249971 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" podStartSLOduration=2.326027809 podStartE2EDuration="8.249940036s" podCreationTimestamp="2026-02-19 09:10:34 +0000 UTC" firstStartedPulling="2026-02-19 09:10:35.284705603 +0000 UTC m=+1537.272717085" lastFinishedPulling="2026-02-19 09:10:41.20861784 +0000 UTC m=+1543.196629312" observedRunningTime="2026-02-19 09:10:42.249917296 +0000 UTC m=+1544.237928798" watchObservedRunningTime="2026-02-19 09:10:42.249940036 +0000 UTC m=+1544.237951518" Feb 19 09:10:45 crc kubenswrapper[4788]: I0219 09:10:45.052692 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-8tsc9"] Feb 19 09:10:45 crc kubenswrapper[4788]: I0219 09:10:45.060564 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-8tsc9"] Feb 19 09:10:46 crc kubenswrapper[4788]: I0219 09:10:46.043170 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-4f7d-account-create-update-jpgd6"] Feb 19 09:10:46 crc kubenswrapper[4788]: I0219 09:10:46.056925 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ab60-account-create-update-755dc"] Feb 19 09:10:46 crc kubenswrapper[4788]: I0219 09:10:46.074393 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-2v24m"] 
Feb 19 09:10:46 crc kubenswrapper[4788]: I0219 09:10:46.087910 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-4f7d-account-create-update-jpgd6"] Feb 19 09:10:46 crc kubenswrapper[4788]: I0219 09:10:46.116627 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ab60-account-create-update-755dc"] Feb 19 09:10:46 crc kubenswrapper[4788]: I0219 09:10:46.134148 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-2v24m"] Feb 19 09:10:46 crc kubenswrapper[4788]: I0219 09:10:46.724900 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72c57601-e6f1-4092-9d4a-49e8c5cf38e3" path="/var/lib/kubelet/pods/72c57601-e6f1-4092-9d4a-49e8c5cf38e3/volumes" Feb 19 09:10:46 crc kubenswrapper[4788]: I0219 09:10:46.725576 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="893edbf1-7994-499b-bd8e-45d7f3e9eb5f" path="/var/lib/kubelet/pods/893edbf1-7994-499b-bd8e-45d7f3e9eb5f/volumes" Feb 19 09:10:46 crc kubenswrapper[4788]: I0219 09:10:46.726149 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b" path="/var/lib/kubelet/pods/d2c80d7c-8af0-47ae-99d2-ec76f4a54d5b/volumes" Feb 19 09:10:46 crc kubenswrapper[4788]: I0219 09:10:46.726696 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7aae0ad-43eb-43dd-b926-6b0847fa9eea" path="/var/lib/kubelet/pods/d7aae0ad-43eb-43dd-b926-6b0847fa9eea/volumes" Feb 19 09:10:49 crc kubenswrapper[4788]: I0219 09:10:49.235075 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:49 crc kubenswrapper[4788]: I0219 09:10:49.236202 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:10:50 crc kubenswrapper[4788]: I0219 09:10:50.298445 4788 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/community-operators-6jfcf" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" containerName="registry-server" probeResult="failure" output=< Feb 19 09:10:50 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 09:10:50 crc kubenswrapper[4788]: > Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.111175 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wg94t"] Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.113486 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.133033 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wg94t"] Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.182193 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-utilities\") pod \"certified-operators-wg94t\" (UID: \"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.182271 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-catalog-content\") pod \"certified-operators-wg94t\" (UID: \"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.182295 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5vtf\" (UniqueName: \"kubernetes.io/projected/c23e4879-fde0-43bf-93c5-a1194d39ea27-kube-api-access-t5vtf\") pod 
\"certified-operators-wg94t\" (UID: \"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.286659 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-utilities\") pod \"certified-operators-wg94t\" (UID: \"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.287136 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-utilities\") pod \"certified-operators-wg94t\" (UID: \"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.287238 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-catalog-content\") pod \"certified-operators-wg94t\" (UID: \"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.287396 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-catalog-content\") pod \"certified-operators-wg94t\" (UID: \"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.287423 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5vtf\" (UniqueName: \"kubernetes.io/projected/c23e4879-fde0-43bf-93c5-a1194d39ea27-kube-api-access-t5vtf\") pod \"certified-operators-wg94t\" (UID: 
\"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.327518 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5vtf\" (UniqueName: \"kubernetes.io/projected/c23e4879-fde0-43bf-93c5-a1194d39ea27-kube-api-access-t5vtf\") pod \"certified-operators-wg94t\" (UID: \"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.446542 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:10:51 crc kubenswrapper[4788]: I0219 09:10:51.953955 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wg94t"] Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.139910 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.139982 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.140051 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.140944 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.141013 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" gracePeriod=600 Feb 19 09:10:52 crc kubenswrapper[4788]: E0219 09:10:52.297655 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.339090 4788 generic.go:334] "Generic (PLEG): container finished" podID="c23e4879-fde0-43bf-93c5-a1194d39ea27" containerID="d2c5167064437ae34ed61c8e24fdfca9fcdbb450b58ec22a484d62448d575909" exitCode=0 Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.339238 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wg94t" event={"ID":"c23e4879-fde0-43bf-93c5-a1194d39ea27","Type":"ContainerDied","Data":"d2c5167064437ae34ed61c8e24fdfca9fcdbb450b58ec22a484d62448d575909"} Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.340343 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wg94t" 
event={"ID":"c23e4879-fde0-43bf-93c5-a1194d39ea27","Type":"ContainerStarted","Data":"53d4ed50eb5d754b961eb1c3864be831dc01e1a83472d3e6e8e9b16c24d6f526"} Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.343388 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" exitCode=0 Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.343444 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81"} Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.343475 4788 scope.go:117] "RemoveContainer" containerID="defbc637313174f365a0e7e0457f9fbe5da4bafbab9b16543dceddb4ed84fa1d" Feb 19 09:10:52 crc kubenswrapper[4788]: I0219 09:10:52.344465 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:10:52 crc kubenswrapper[4788]: E0219 09:10:52.344897 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:10:54 crc kubenswrapper[4788]: I0219 09:10:54.049412 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1d7e-account-create-update-bhj4l"] Feb 19 09:10:54 crc kubenswrapper[4788]: I0219 09:10:54.062421 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-q8z7r"] Feb 19 09:10:54 crc kubenswrapper[4788]: I0219 09:10:54.073186 4788 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1d7e-account-create-update-bhj4l"] Feb 19 09:10:54 crc kubenswrapper[4788]: I0219 09:10:54.084102 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-q8z7r"] Feb 19 09:10:54 crc kubenswrapper[4788]: I0219 09:10:54.373409 4788 generic.go:334] "Generic (PLEG): container finished" podID="c23e4879-fde0-43bf-93c5-a1194d39ea27" containerID="6936dd575e2de31438bd3d20258de1db870952cf5817141bc86647cda84b8b5e" exitCode=0 Feb 19 09:10:54 crc kubenswrapper[4788]: I0219 09:10:54.373516 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wg94t" event={"ID":"c23e4879-fde0-43bf-93c5-a1194d39ea27","Type":"ContainerDied","Data":"6936dd575e2de31438bd3d20258de1db870952cf5817141bc86647cda84b8b5e"} Feb 19 09:10:54 crc kubenswrapper[4788]: I0219 09:10:54.727211 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73420a48-bf28-40cc-b232-fab14ef5745e" path="/var/lib/kubelet/pods/73420a48-bf28-40cc-b232-fab14ef5745e/volumes" Feb 19 09:10:54 crc kubenswrapper[4788]: I0219 09:10:54.728088 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1e0aea8-b2f5-42f4-ab90-77423a7832ce" path="/var/lib/kubelet/pods/f1e0aea8-b2f5-42f4-ab90-77423a7832ce/volumes" Feb 19 09:10:58 crc kubenswrapper[4788]: I0219 09:10:58.416228 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wg94t" event={"ID":"c23e4879-fde0-43bf-93c5-a1194d39ea27","Type":"ContainerStarted","Data":"86a34cdbd701cb26cfbfe885a0e8e55ae375ba95e8b484a66a4191e1185117de"} Feb 19 09:10:58 crc kubenswrapper[4788]: I0219 09:10:58.436588 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wg94t" podStartSLOduration=2.816201364 podStartE2EDuration="7.436561595s" podCreationTimestamp="2026-02-19 09:10:51 +0000 UTC" 
firstStartedPulling="2026-02-19 09:10:52.341239695 +0000 UTC m=+1554.329251167" lastFinishedPulling="2026-02-19 09:10:56.961599916 +0000 UTC m=+1558.949611398" observedRunningTime="2026-02-19 09:10:58.433712873 +0000 UTC m=+1560.421724345" watchObservedRunningTime="2026-02-19 09:10:58.436561595 +0000 UTC m=+1560.424573067" Feb 19 09:11:00 crc kubenswrapper[4788]: I0219 09:11:00.287534 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6jfcf" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" containerName="registry-server" probeResult="failure" output=< Feb 19 09:11:00 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 09:11:00 crc kubenswrapper[4788]: > Feb 19 09:11:01 crc kubenswrapper[4788]: I0219 09:11:01.447136 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:11:01 crc kubenswrapper[4788]: I0219 09:11:01.447698 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:11:01 crc kubenswrapper[4788]: I0219 09:11:01.508075 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:11:02 crc kubenswrapper[4788]: I0219 09:11:02.533608 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:11:02 crc kubenswrapper[4788]: I0219 09:11:02.606141 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wg94t"] Feb 19 09:11:04 crc kubenswrapper[4788]: I0219 09:11:04.486363 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wg94t" podUID="c23e4879-fde0-43bf-93c5-a1194d39ea27" containerName="registry-server" 
containerID="cri-o://86a34cdbd701cb26cfbfe885a0e8e55ae375ba95e8b484a66a4191e1185117de" gracePeriod=2 Feb 19 09:11:04 crc kubenswrapper[4788]: I0219 09:11:04.714314 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:11:04 crc kubenswrapper[4788]: E0219 09:11:04.714606 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.499883 4788 generic.go:334] "Generic (PLEG): container finished" podID="c23e4879-fde0-43bf-93c5-a1194d39ea27" containerID="86a34cdbd701cb26cfbfe885a0e8e55ae375ba95e8b484a66a4191e1185117de" exitCode=0 Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.499969 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wg94t" event={"ID":"c23e4879-fde0-43bf-93c5-a1194d39ea27","Type":"ContainerDied","Data":"86a34cdbd701cb26cfbfe885a0e8e55ae375ba95e8b484a66a4191e1185117de"} Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.500401 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wg94t" event={"ID":"c23e4879-fde0-43bf-93c5-a1194d39ea27","Type":"ContainerDied","Data":"53d4ed50eb5d754b961eb1c3864be831dc01e1a83472d3e6e8e9b16c24d6f526"} Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.500442 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53d4ed50eb5d754b961eb1c3864be831dc01e1a83472d3e6e8e9b16c24d6f526" Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.553942 4788 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.701045 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-utilities\") pod \"c23e4879-fde0-43bf-93c5-a1194d39ea27\" (UID: \"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.701108 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5vtf\" (UniqueName: \"kubernetes.io/projected/c23e4879-fde0-43bf-93c5-a1194d39ea27-kube-api-access-t5vtf\") pod \"c23e4879-fde0-43bf-93c5-a1194d39ea27\" (UID: \"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.701362 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-catalog-content\") pod \"c23e4879-fde0-43bf-93c5-a1194d39ea27\" (UID: \"c23e4879-fde0-43bf-93c5-a1194d39ea27\") " Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.703079 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-utilities" (OuterVolumeSpecName: "utilities") pod "c23e4879-fde0-43bf-93c5-a1194d39ea27" (UID: "c23e4879-fde0-43bf-93c5-a1194d39ea27"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.704990 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.709843 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c23e4879-fde0-43bf-93c5-a1194d39ea27-kube-api-access-t5vtf" (OuterVolumeSpecName: "kube-api-access-t5vtf") pod "c23e4879-fde0-43bf-93c5-a1194d39ea27" (UID: "c23e4879-fde0-43bf-93c5-a1194d39ea27"). InnerVolumeSpecName "kube-api-access-t5vtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.771644 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c23e4879-fde0-43bf-93c5-a1194d39ea27" (UID: "c23e4879-fde0-43bf-93c5-a1194d39ea27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.809809 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5vtf\" (UniqueName: \"kubernetes.io/projected/c23e4879-fde0-43bf-93c5-a1194d39ea27-kube-api-access-t5vtf\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:05 crc kubenswrapper[4788]: I0219 09:11:05.809852 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c23e4879-fde0-43bf-93c5-a1194d39ea27-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:06 crc kubenswrapper[4788]: I0219 09:11:06.510338 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wg94t" Feb 19 09:11:06 crc kubenswrapper[4788]: I0219 09:11:06.567394 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wg94t"] Feb 19 09:11:06 crc kubenswrapper[4788]: I0219 09:11:06.580699 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wg94t"] Feb 19 09:11:06 crc kubenswrapper[4788]: I0219 09:11:06.728093 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c23e4879-fde0-43bf-93c5-a1194d39ea27" path="/var/lib/kubelet/pods/c23e4879-fde0-43bf-93c5-a1194d39ea27/volumes" Feb 19 09:11:10 crc kubenswrapper[4788]: I0219 09:11:10.283262 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6jfcf" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" containerName="registry-server" probeResult="failure" output=< Feb 19 09:11:10 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 09:11:10 crc kubenswrapper[4788]: > Feb 19 09:11:13 crc kubenswrapper[4788]: I0219 09:11:13.048981 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2qxmb"] Feb 19 09:11:13 crc kubenswrapper[4788]: I0219 09:11:13.055908 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2qxmb"] Feb 19 09:11:14 crc kubenswrapper[4788]: I0219 09:11:14.727845 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd64b1ce-8564-4921-b382-8adf535a61a4" path="/var/lib/kubelet/pods/cd64b1ce-8564-4921-b382-8adf535a61a4/volumes" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.325340 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-whbtx"] Feb 19 09:11:16 crc kubenswrapper[4788]: E0219 09:11:16.325711 4788 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c23e4879-fde0-43bf-93c5-a1194d39ea27" containerName="registry-server" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.325723 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23e4879-fde0-43bf-93c5-a1194d39ea27" containerName="registry-server" Feb 19 09:11:16 crc kubenswrapper[4788]: E0219 09:11:16.325748 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23e4879-fde0-43bf-93c5-a1194d39ea27" containerName="extract-content" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.325754 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23e4879-fde0-43bf-93c5-a1194d39ea27" containerName="extract-content" Feb 19 09:11:16 crc kubenswrapper[4788]: E0219 09:11:16.325767 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c23e4879-fde0-43bf-93c5-a1194d39ea27" containerName="extract-utilities" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.325773 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c23e4879-fde0-43bf-93c5-a1194d39ea27" containerName="extract-utilities" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.325930 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="c23e4879-fde0-43bf-93c5-a1194d39ea27" containerName="registry-server" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.327316 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.362965 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-whbtx"] Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.437727 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-catalog-content\") pod \"redhat-operators-whbtx\" (UID: \"fc527cdf-b038-4c50-903f-816dd46ef453\") " pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.438085 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmcgk\" (UniqueName: \"kubernetes.io/projected/fc527cdf-b038-4c50-903f-816dd46ef453-kube-api-access-zmcgk\") pod \"redhat-operators-whbtx\" (UID: \"fc527cdf-b038-4c50-903f-816dd46ef453\") " pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.438222 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-utilities\") pod \"redhat-operators-whbtx\" (UID: \"fc527cdf-b038-4c50-903f-816dd46ef453\") " pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.541222 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-catalog-content\") pod \"redhat-operators-whbtx\" (UID: \"fc527cdf-b038-4c50-903f-816dd46ef453\") " pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.541354 4788 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-zmcgk\" (UniqueName: \"kubernetes.io/projected/fc527cdf-b038-4c50-903f-816dd46ef453-kube-api-access-zmcgk\") pod \"redhat-operators-whbtx\" (UID: \"fc527cdf-b038-4c50-903f-816dd46ef453\") " pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.541392 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-utilities\") pod \"redhat-operators-whbtx\" (UID: \"fc527cdf-b038-4c50-903f-816dd46ef453\") " pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.542347 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-catalog-content\") pod \"redhat-operators-whbtx\" (UID: \"fc527cdf-b038-4c50-903f-816dd46ef453\") " pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.573545 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmcgk\" (UniqueName: \"kubernetes.io/projected/fc527cdf-b038-4c50-903f-816dd46ef453-kube-api-access-zmcgk\") pod \"redhat-operators-whbtx\" (UID: \"fc527cdf-b038-4c50-903f-816dd46ef453\") " pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.625380 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-utilities\") pod \"redhat-operators-whbtx\" (UID: \"fc527cdf-b038-4c50-903f-816dd46ef453\") " pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.654310 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:16 crc kubenswrapper[4788]: I0219 09:11:16.719507 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:11:16 crc kubenswrapper[4788]: E0219 09:11:16.719763 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:11:17 crc kubenswrapper[4788]: I0219 09:11:17.140724 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-whbtx"] Feb 19 09:11:17 crc kubenswrapper[4788]: I0219 09:11:17.625820 4788 generic.go:334] "Generic (PLEG): container finished" podID="fc527cdf-b038-4c50-903f-816dd46ef453" containerID="2c2c4975eef8e64d06225284ee5874113324188e71e2659c81da646f4730844a" exitCode=0 Feb 19 09:11:17 crc kubenswrapper[4788]: I0219 09:11:17.625917 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whbtx" event={"ID":"fc527cdf-b038-4c50-903f-816dd46ef453","Type":"ContainerDied","Data":"2c2c4975eef8e64d06225284ee5874113324188e71e2659c81da646f4730844a"} Feb 19 09:11:17 crc kubenswrapper[4788]: I0219 09:11:17.625990 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whbtx" event={"ID":"fc527cdf-b038-4c50-903f-816dd46ef453","Type":"ContainerStarted","Data":"b2b88fc9e0c8848f56f3d79eb5b6d4644690b71ce0313c4cbec105a43487d040"} Feb 19 09:11:19 crc kubenswrapper[4788]: I0219 09:11:19.287721 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:11:19 crc kubenswrapper[4788]: I0219 09:11:19.331223 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:11:19 crc kubenswrapper[4788]: I0219 09:11:19.645363 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whbtx" event={"ID":"fc527cdf-b038-4c50-903f-816dd46ef453","Type":"ContainerStarted","Data":"71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b"} Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.299741 4788 scope.go:117] "RemoveContainer" containerID="72a01b937262c26236c20bc0835900b832146b77519d07b8221c1e916fd1b451" Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.533598 4788 scope.go:117] "RemoveContainer" containerID="d35817ec6d70bb48295ca866918e33dd9f2c28cfe5e68556e823c24fd245486d" Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.564972 4788 scope.go:117] "RemoveContainer" containerID="95bd2fff0c8ec80293c2bfe9fbace3c8a5166adbfd0953574ef94264e9325d3c" Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.637469 4788 scope.go:117] "RemoveContainer" containerID="91dca1f13e60bcae0ab1d1fb4d153cbe06f24e26897618a01929ca3b69b758ab" Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.657327 4788 generic.go:334] "Generic (PLEG): container finished" podID="fc527cdf-b038-4c50-903f-816dd46ef453" containerID="71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b" exitCode=0 Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.657386 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whbtx" event={"ID":"fc527cdf-b038-4c50-903f-816dd46ef453","Type":"ContainerDied","Data":"71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b"} Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.680814 4788 scope.go:117] "RemoveContainer" 
containerID="521a7ea9f60cd14ae42a92c828099e7e5154afe94d17ea85ffde3e2c24c46301" Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.710004 4788 scope.go:117] "RemoveContainer" containerID="03c010f3555fa97f65bbdac9cde71ea55af050dd3e8b5db3087389c44348646a" Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.807595 4788 scope.go:117] "RemoveContainer" containerID="c70831671c343fd180d6e976569a54d0b4c7b0b195a1443f22279f4aff8d858e" Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.853945 4788 scope.go:117] "RemoveContainer" containerID="8048e78cd541778c58128d7cd30d64a40746f152fbf66e6a830537fc96bfadbf" Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.878399 4788 scope.go:117] "RemoveContainer" containerID="3f8e6b280343f0d0df0929ee8129368e714260c596f952098114b71dc2a4f655" Feb 19 09:11:20 crc kubenswrapper[4788]: I0219 09:11:20.952451 4788 scope.go:117] "RemoveContainer" containerID="ab032eebaa648366b54360a4379ada4f9c40b6b944ec02c2338a50e26ce0a054" Feb 19 09:11:21 crc kubenswrapper[4788]: I0219 09:11:21.014777 4788 scope.go:117] "RemoveContainer" containerID="e8c9a156a40c423ac06997fa3fefd0442bba40c9fb97dfe85d3ebffdb0a72307" Feb 19 09:11:21 crc kubenswrapper[4788]: I0219 09:11:21.040368 4788 scope.go:117] "RemoveContainer" containerID="96dd64663f5f5db9e2bf53db859a4510b8487f7bd65bb58ed88ac495e57a16b7" Feb 19 09:11:21 crc kubenswrapper[4788]: I0219 09:11:21.685823 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6jfcf"] Feb 19 09:11:21 crc kubenswrapper[4788]: I0219 09:11:21.686063 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6jfcf" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" containerName="registry-server" containerID="cri-o://dad05abfac3ee7bc149ea460d1bcaeb0b7b9f1664655512277c36fd62ec2b499" gracePeriod=2 Feb 19 09:11:22 crc kubenswrapper[4788]: I0219 09:11:22.045474 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-db-sync-jq4z9"] Feb 19 09:11:22 crc kubenswrapper[4788]: I0219 09:11:22.058870 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jq4z9"] Feb 19 09:11:22 crc kubenswrapper[4788]: I0219 09:11:22.725398 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1f8c5a-64ff-4e33-a3d0-409d025d567b" path="/var/lib/kubelet/pods/bf1f8c5a-64ff-4e33-a3d0-409d025d567b/volumes" Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.555499 4788 generic.go:334] "Generic (PLEG): container finished" podID="562a8331-2771-4300-9d37-a0db3cac89a6" containerID="dad05abfac3ee7bc149ea460d1bcaeb0b7b9f1664655512277c36fd62ec2b499" exitCode=0 Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.555603 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jfcf" event={"ID":"562a8331-2771-4300-9d37-a0db3cac89a6","Type":"ContainerDied","Data":"dad05abfac3ee7bc149ea460d1bcaeb0b7b9f1664655512277c36fd62ec2b499"} Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.611334 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.763859 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-utilities\") pod \"562a8331-2771-4300-9d37-a0db3cac89a6\" (UID: \"562a8331-2771-4300-9d37-a0db3cac89a6\") " Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.764036 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhdvv\" (UniqueName: \"kubernetes.io/projected/562a8331-2771-4300-9d37-a0db3cac89a6-kube-api-access-dhdvv\") pod \"562a8331-2771-4300-9d37-a0db3cac89a6\" (UID: \"562a8331-2771-4300-9d37-a0db3cac89a6\") " Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.764087 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-catalog-content\") pod \"562a8331-2771-4300-9d37-a0db3cac89a6\" (UID: \"562a8331-2771-4300-9d37-a0db3cac89a6\") " Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.764692 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-utilities" (OuterVolumeSpecName: "utilities") pod "562a8331-2771-4300-9d37-a0db3cac89a6" (UID: "562a8331-2771-4300-9d37-a0db3cac89a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.780440 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/562a8331-2771-4300-9d37-a0db3cac89a6-kube-api-access-dhdvv" (OuterVolumeSpecName: "kube-api-access-dhdvv") pod "562a8331-2771-4300-9d37-a0db3cac89a6" (UID: "562a8331-2771-4300-9d37-a0db3cac89a6"). InnerVolumeSpecName "kube-api-access-dhdvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.816443 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "562a8331-2771-4300-9d37-a0db3cac89a6" (UID: "562a8331-2771-4300-9d37-a0db3cac89a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.866934 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhdvv\" (UniqueName: \"kubernetes.io/projected/562a8331-2771-4300-9d37-a0db3cac89a6-kube-api-access-dhdvv\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.866992 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:27 crc kubenswrapper[4788]: I0219 09:11:27.867002 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/562a8331-2771-4300-9d37-a0db3cac89a6-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:28 crc kubenswrapper[4788]: I0219 09:11:28.566689 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whbtx" event={"ID":"fc527cdf-b038-4c50-903f-816dd46ef453","Type":"ContainerStarted","Data":"b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6"} Feb 19 09:11:28 crc kubenswrapper[4788]: I0219 09:11:28.569078 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6jfcf" event={"ID":"562a8331-2771-4300-9d37-a0db3cac89a6","Type":"ContainerDied","Data":"f6e09c8910ba3062db06f962e02f0e74f69ef3df486dde45db6e16353e0401b3"} Feb 19 09:11:28 crc kubenswrapper[4788]: I0219 
09:11:28.569130 4788 scope.go:117] "RemoveContainer" containerID="dad05abfac3ee7bc149ea460d1bcaeb0b7b9f1664655512277c36fd62ec2b499" Feb 19 09:11:28 crc kubenswrapper[4788]: I0219 09:11:28.569186 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6jfcf" Feb 19 09:11:28 crc kubenswrapper[4788]: I0219 09:11:28.591086 4788 scope.go:117] "RemoveContainer" containerID="61b0e99de294027d2e712770f9be71573fc2cfae8b7f185e2467f10f9a362813" Feb 19 09:11:28 crc kubenswrapper[4788]: I0219 09:11:28.598771 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-whbtx" podStartSLOduration=2.908002901 podStartE2EDuration="12.598751908s" podCreationTimestamp="2026-02-19 09:11:16 +0000 UTC" firstStartedPulling="2026-02-19 09:11:17.629407783 +0000 UTC m=+1579.617419265" lastFinishedPulling="2026-02-19 09:11:27.3201568 +0000 UTC m=+1589.308168272" observedRunningTime="2026-02-19 09:11:28.587629569 +0000 UTC m=+1590.575641041" watchObservedRunningTime="2026-02-19 09:11:28.598751908 +0000 UTC m=+1590.586763380" Feb 19 09:11:28 crc kubenswrapper[4788]: I0219 09:11:28.622177 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6jfcf"] Feb 19 09:11:28 crc kubenswrapper[4788]: I0219 09:11:28.624498 4788 scope.go:117] "RemoveContainer" containerID="146227fa0b031a26677c3ec8ed931973b25e2f7f906366139f0abbb5d59edb82" Feb 19 09:11:28 crc kubenswrapper[4788]: I0219 09:11:28.634388 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6jfcf"] Feb 19 09:11:28 crc kubenswrapper[4788]: I0219 09:11:28.728021 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" path="/var/lib/kubelet/pods/562a8331-2771-4300-9d37-a0db3cac89a6/volumes" Feb 19 09:11:30 crc kubenswrapper[4788]: I0219 09:11:30.714759 4788 scope.go:117] 
"RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:11:30 crc kubenswrapper[4788]: E0219 09:11:30.715591 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:11:31 crc kubenswrapper[4788]: I0219 09:11:31.031523 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-jdh5n"] Feb 19 09:11:31 crc kubenswrapper[4788]: I0219 09:11:31.044311 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-jdh5n"] Feb 19 09:11:32 crc kubenswrapper[4788]: I0219 09:11:32.727916 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="231f701c-d9c4-4157-bcf2-fe8875ce36e7" path="/var/lib/kubelet/pods/231f701c-d9c4-4157-bcf2-fe8875ce36e7/volumes" Feb 19 09:11:34 crc kubenswrapper[4788]: I0219 09:11:34.033772 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b159-account-create-update-lk6tl"] Feb 19 09:11:34 crc kubenswrapper[4788]: I0219 09:11:34.042985 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-07bb-account-create-update-kvcc2"] Feb 19 09:11:34 crc kubenswrapper[4788]: I0219 09:11:34.050494 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-07bb-account-create-update-kvcc2"] Feb 19 09:11:34 crc kubenswrapper[4788]: I0219 09:11:34.058416 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b159-account-create-update-lk6tl"] Feb 19 09:11:34 crc kubenswrapper[4788]: I0219 09:11:34.725365 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d07fa4d7-5916-4288-9b30-0413795f6a69" path="/var/lib/kubelet/pods/d07fa4d7-5916-4288-9b30-0413795f6a69/volumes" Feb 19 09:11:34 crc kubenswrapper[4788]: I0219 09:11:34.726317 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1223d0f-9cda-4590-9ae6-353c58886f99" path="/var/lib/kubelet/pods/d1223d0f-9cda-4590-9ae6-353c58886f99/volumes" Feb 19 09:11:35 crc kubenswrapper[4788]: I0219 09:11:35.031316 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-brnnm"] Feb 19 09:11:35 crc kubenswrapper[4788]: I0219 09:11:35.040882 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-d8ngr"] Feb 19 09:11:35 crc kubenswrapper[4788]: I0219 09:11:35.053072 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0317-account-create-update-p9xsz"] Feb 19 09:11:35 crc kubenswrapper[4788]: I0219 09:11:35.060823 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-5w8kn"] Feb 19 09:11:35 crc kubenswrapper[4788]: I0219 09:11:35.068500 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-brnnm"] Feb 19 09:11:35 crc kubenswrapper[4788]: I0219 09:11:35.076200 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-5w8kn"] Feb 19 09:11:35 crc kubenswrapper[4788]: I0219 09:11:35.083502 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0317-account-create-update-p9xsz"] Feb 19 09:11:35 crc kubenswrapper[4788]: I0219 09:11:35.091070 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-d8ngr"] Feb 19 09:11:35 crc kubenswrapper[4788]: I0219 09:11:35.098034 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-fbfe-account-create-update-jmjqh"] Feb 19 09:11:35 crc kubenswrapper[4788]: I0219 09:11:35.105770 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-fbfe-account-create-update-jmjqh"] Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.528227 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s25nc"] Feb 19 09:11:36 crc kubenswrapper[4788]: E0219 09:11:36.528987 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" containerName="registry-server" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.529004 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" containerName="registry-server" Feb 19 09:11:36 crc kubenswrapper[4788]: E0219 09:11:36.529027 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" containerName="extract-utilities" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.529034 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" containerName="extract-utilities" Feb 19 09:11:36 crc kubenswrapper[4788]: E0219 09:11:36.529045 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" containerName="extract-content" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.529051 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" containerName="extract-content" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.529355 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="562a8331-2771-4300-9d37-a0db3cac89a6" containerName="registry-server" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.530896 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.538179 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s25nc"] Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.634943 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-utilities\") pod \"redhat-marketplace-s25nc\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.635013 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncntn\" (UniqueName: \"kubernetes.io/projected/ba8f366b-facb-455e-a5cd-76d6e075cfe5-kube-api-access-ncntn\") pod \"redhat-marketplace-s25nc\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.635042 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-catalog-content\") pod \"redhat-marketplace-s25nc\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.655011 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.655062 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.704023 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.730107 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59cdc318-8f14-4606-aa81-1a16a1ed697b" path="/var/lib/kubelet/pods/59cdc318-8f14-4606-aa81-1a16a1ed697b/volumes" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.731292 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9a6da4-e188-4741-b5a4-60a33b8cd415" path="/var/lib/kubelet/pods/7b9a6da4-e188-4741-b5a4-60a33b8cd415/volumes" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.732125 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9eb8664-672b-45c0-a128-1e60f6ea6a0e" path="/var/lib/kubelet/pods/b9eb8664-672b-45c0-a128-1e60f6ea6a0e/volumes" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.732948 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d" path="/var/lib/kubelet/pods/ba7f5bc6-91a9-46ba-a937-f3ee0ff53a2d/volumes" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.734538 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da103dab-8e46-466c-90db-c237910cc9e7" path="/var/lib/kubelet/pods/da103dab-8e46-466c-90db-c237910cc9e7/volumes" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.736343 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-utilities\") pod \"redhat-marketplace-s25nc\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.736423 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncntn\" (UniqueName: \"kubernetes.io/projected/ba8f366b-facb-455e-a5cd-76d6e075cfe5-kube-api-access-ncntn\") pod 
\"redhat-marketplace-s25nc\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.736454 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-catalog-content\") pod \"redhat-marketplace-s25nc\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.736897 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-utilities\") pod \"redhat-marketplace-s25nc\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.736914 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-catalog-content\") pod \"redhat-marketplace-s25nc\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.754645 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncntn\" (UniqueName: \"kubernetes.io/projected/ba8f366b-facb-455e-a5cd-76d6e075cfe5-kube-api-access-ncntn\") pod \"redhat-marketplace-s25nc\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:36 crc kubenswrapper[4788]: I0219 09:11:36.874473 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:37 crc kubenswrapper[4788]: I0219 09:11:37.315726 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s25nc"] Feb 19 09:11:37 crc kubenswrapper[4788]: I0219 09:11:37.651333 4788 generic.go:334] "Generic (PLEG): container finished" podID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" containerID="b78aeb0178005a07a3192d5e0232ce599847d74127f8a3a9d8bdda807ff10f1c" exitCode=0 Feb 19 09:11:37 crc kubenswrapper[4788]: I0219 09:11:37.651385 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s25nc" event={"ID":"ba8f366b-facb-455e-a5cd-76d6e075cfe5","Type":"ContainerDied","Data":"b78aeb0178005a07a3192d5e0232ce599847d74127f8a3a9d8bdda807ff10f1c"} Feb 19 09:11:37 crc kubenswrapper[4788]: I0219 09:11:37.651640 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s25nc" event={"ID":"ba8f366b-facb-455e-a5cd-76d6e075cfe5","Type":"ContainerStarted","Data":"bb3bdc74087b274c097f46ec342c4e68a849d76d7b73401010a54ce828880393"} Feb 19 09:11:37 crc kubenswrapper[4788]: I0219 09:11:37.701610 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:39 crc kubenswrapper[4788]: I0219 09:11:39.044839 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-zzzf4"] Feb 19 09:11:39 crc kubenswrapper[4788]: I0219 09:11:39.057320 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-zzzf4"] Feb 19 09:11:39 crc kubenswrapper[4788]: I0219 09:11:39.109538 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-whbtx"] Feb 19 09:11:39 crc kubenswrapper[4788]: I0219 09:11:39.670443 4788 generic.go:334] "Generic (PLEG): container finished" podID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" 
containerID="c783a6197a3c72ac50f1dafd1ce37f010edfb90bce860903a05bd6e5590498ce" exitCode=0 Feb 19 09:11:39 crc kubenswrapper[4788]: I0219 09:11:39.670513 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s25nc" event={"ID":"ba8f366b-facb-455e-a5cd-76d6e075cfe5","Type":"ContainerDied","Data":"c783a6197a3c72ac50f1dafd1ce37f010edfb90bce860903a05bd6e5590498ce"} Feb 19 09:11:39 crc kubenswrapper[4788]: I0219 09:11:39.670725 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-whbtx" podUID="fc527cdf-b038-4c50-903f-816dd46ef453" containerName="registry-server" containerID="cri-o://b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6" gracePeriod=2 Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.256681 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.312858 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmcgk\" (UniqueName: \"kubernetes.io/projected/fc527cdf-b038-4c50-903f-816dd46ef453-kube-api-access-zmcgk\") pod \"fc527cdf-b038-4c50-903f-816dd46ef453\" (UID: \"fc527cdf-b038-4c50-903f-816dd46ef453\") " Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.312925 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-catalog-content\") pod \"fc527cdf-b038-4c50-903f-816dd46ef453\" (UID: \"fc527cdf-b038-4c50-903f-816dd46ef453\") " Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.313010 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-utilities\") pod \"fc527cdf-b038-4c50-903f-816dd46ef453\" (UID: 
\"fc527cdf-b038-4c50-903f-816dd46ef453\") " Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.314082 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-utilities" (OuterVolumeSpecName: "utilities") pod "fc527cdf-b038-4c50-903f-816dd46ef453" (UID: "fc527cdf-b038-4c50-903f-816dd46ef453"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.332548 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc527cdf-b038-4c50-903f-816dd46ef453-kube-api-access-zmcgk" (OuterVolumeSpecName: "kube-api-access-zmcgk") pod "fc527cdf-b038-4c50-903f-816dd46ef453" (UID: "fc527cdf-b038-4c50-903f-816dd46ef453"). InnerVolumeSpecName "kube-api-access-zmcgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.414581 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmcgk\" (UniqueName: \"kubernetes.io/projected/fc527cdf-b038-4c50-903f-816dd46ef453-kube-api-access-zmcgk\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.414610 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.456043 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc527cdf-b038-4c50-903f-816dd46ef453" (UID: "fc527cdf-b038-4c50-903f-816dd46ef453"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.515974 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc527cdf-b038-4c50-903f-816dd46ef453-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.682302 4788 generic.go:334] "Generic (PLEG): container finished" podID="fc527cdf-b038-4c50-903f-816dd46ef453" containerID="b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6" exitCode=0 Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.682348 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whbtx" event={"ID":"fc527cdf-b038-4c50-903f-816dd46ef453","Type":"ContainerDied","Data":"b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6"} Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.682380 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-whbtx" event={"ID":"fc527cdf-b038-4c50-903f-816dd46ef453","Type":"ContainerDied","Data":"b2b88fc9e0c8848f56f3d79eb5b6d4644690b71ce0313c4cbec105a43487d040"} Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.682404 4788 scope.go:117] "RemoveContainer" containerID="b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.682587 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-whbtx" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.705078 4788 scope.go:117] "RemoveContainer" containerID="71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.728007 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb2eae7-2f3d-424a-b805-b8452ceee91f" path="/var/lib/kubelet/pods/ccb2eae7-2f3d-424a-b805-b8452ceee91f/volumes" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.728957 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-whbtx"] Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.732586 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-whbtx"] Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.747494 4788 scope.go:117] "RemoveContainer" containerID="2c2c4975eef8e64d06225284ee5874113324188e71e2659c81da646f4730844a" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.791266 4788 scope.go:117] "RemoveContainer" containerID="b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6" Feb 19 09:11:40 crc kubenswrapper[4788]: E0219 09:11:40.791691 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6\": container with ID starting with b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6 not found: ID does not exist" containerID="b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.791718 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6"} err="failed to get container status 
\"b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6\": rpc error: code = NotFound desc = could not find container \"b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6\": container with ID starting with b3f9179531dc046384fcd69b2a840261fbd81bcdc22f5f1cf1476216588fd9b6 not found: ID does not exist" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.791738 4788 scope.go:117] "RemoveContainer" containerID="71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b" Feb 19 09:11:40 crc kubenswrapper[4788]: E0219 09:11:40.791909 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b\": container with ID starting with 71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b not found: ID does not exist" containerID="71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.791926 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b"} err="failed to get container status \"71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b\": rpc error: code = NotFound desc = could not find container \"71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b\": container with ID starting with 71624384a68af3bc6552b65bfdabf9bd1ec63eb3614018c0018464abb2f2226b not found: ID does not exist" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.791937 4788 scope.go:117] "RemoveContainer" containerID="2c2c4975eef8e64d06225284ee5874113324188e71e2659c81da646f4730844a" Feb 19 09:11:40 crc kubenswrapper[4788]: E0219 09:11:40.792075 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2c2c4975eef8e64d06225284ee5874113324188e71e2659c81da646f4730844a\": container with ID starting with 2c2c4975eef8e64d06225284ee5874113324188e71e2659c81da646f4730844a not found: ID does not exist" containerID="2c2c4975eef8e64d06225284ee5874113324188e71e2659c81da646f4730844a" Feb 19 09:11:40 crc kubenswrapper[4788]: I0219 09:11:40.792091 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c2c4975eef8e64d06225284ee5874113324188e71e2659c81da646f4730844a"} err="failed to get container status \"2c2c4975eef8e64d06225284ee5874113324188e71e2659c81da646f4730844a\": rpc error: code = NotFound desc = could not find container \"2c2c4975eef8e64d06225284ee5874113324188e71e2659c81da646f4730844a\": container with ID starting with 2c2c4975eef8e64d06225284ee5874113324188e71e2659c81da646f4730844a not found: ID does not exist" Feb 19 09:11:41 crc kubenswrapper[4788]: I0219 09:11:41.774306 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s25nc" event={"ID":"ba8f366b-facb-455e-a5cd-76d6e075cfe5","Type":"ContainerStarted","Data":"d59389d68b059cd88b21a7924043c095417f2e8b66d883b638a2d1cb9bf1e255"} Feb 19 09:11:41 crc kubenswrapper[4788]: I0219 09:11:41.800717 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s25nc" podStartSLOduration=3.052323747 podStartE2EDuration="5.800696562s" podCreationTimestamp="2026-02-19 09:11:36 +0000 UTC" firstStartedPulling="2026-02-19 09:11:37.653134675 +0000 UTC m=+1599.641146147" lastFinishedPulling="2026-02-19 09:11:40.40150749 +0000 UTC m=+1602.389518962" observedRunningTime="2026-02-19 09:11:41.794590469 +0000 UTC m=+1603.782601951" watchObservedRunningTime="2026-02-19 09:11:41.800696562 +0000 UTC m=+1603.788708024" Feb 19 09:11:42 crc kubenswrapper[4788]: I0219 09:11:42.726086 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fc527cdf-b038-4c50-903f-816dd46ef453" path="/var/lib/kubelet/pods/fc527cdf-b038-4c50-903f-816dd46ef453/volumes" Feb 19 09:11:43 crc kubenswrapper[4788]: I0219 09:11:43.714344 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:11:43 crc kubenswrapper[4788]: E0219 09:11:43.714595 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:11:46 crc kubenswrapper[4788]: I0219 09:11:46.875040 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:46 crc kubenswrapper[4788]: I0219 09:11:46.875087 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:46 crc kubenswrapper[4788]: I0219 09:11:46.928818 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:47 crc kubenswrapper[4788]: I0219 09:11:47.901902 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:47 crc kubenswrapper[4788]: I0219 09:11:47.953486 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s25nc"] Feb 19 09:11:49 crc kubenswrapper[4788]: I0219 09:11:49.867974 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s25nc" podUID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" containerName="registry-server" 
containerID="cri-o://d59389d68b059cd88b21a7924043c095417f2e8b66d883b638a2d1cb9bf1e255" gracePeriod=2 Feb 19 09:11:50 crc kubenswrapper[4788]: I0219 09:11:50.881580 4788 generic.go:334] "Generic (PLEG): container finished" podID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" containerID="d59389d68b059cd88b21a7924043c095417f2e8b66d883b638a2d1cb9bf1e255" exitCode=0 Feb 19 09:11:50 crc kubenswrapper[4788]: I0219 09:11:50.881638 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s25nc" event={"ID":"ba8f366b-facb-455e-a5cd-76d6e075cfe5","Type":"ContainerDied","Data":"d59389d68b059cd88b21a7924043c095417f2e8b66d883b638a2d1cb9bf1e255"} Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.147361 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.258627 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-utilities\") pod \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.258910 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncntn\" (UniqueName: \"kubernetes.io/projected/ba8f366b-facb-455e-a5cd-76d6e075cfe5-kube-api-access-ncntn\") pod \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.259098 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-catalog-content\") pod \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\" (UID: \"ba8f366b-facb-455e-a5cd-76d6e075cfe5\") " Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 
09:11:52.259685 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-utilities" (OuterVolumeSpecName: "utilities") pod "ba8f366b-facb-455e-a5cd-76d6e075cfe5" (UID: "ba8f366b-facb-455e-a5cd-76d6e075cfe5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.259898 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.264375 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8f366b-facb-455e-a5cd-76d6e075cfe5-kube-api-access-ncntn" (OuterVolumeSpecName: "kube-api-access-ncntn") pod "ba8f366b-facb-455e-a5cd-76d6e075cfe5" (UID: "ba8f366b-facb-455e-a5cd-76d6e075cfe5"). InnerVolumeSpecName "kube-api-access-ncntn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.302563 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba8f366b-facb-455e-a5cd-76d6e075cfe5" (UID: "ba8f366b-facb-455e-a5cd-76d6e075cfe5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.361555 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncntn\" (UniqueName: \"kubernetes.io/projected/ba8f366b-facb-455e-a5cd-76d6e075cfe5-kube-api-access-ncntn\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.361590 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba8f366b-facb-455e-a5cd-76d6e075cfe5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.911621 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s25nc" event={"ID":"ba8f366b-facb-455e-a5cd-76d6e075cfe5","Type":"ContainerDied","Data":"bb3bdc74087b274c097f46ec342c4e68a849d76d7b73401010a54ce828880393"} Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.911668 4788 scope.go:117] "RemoveContainer" containerID="d59389d68b059cd88b21a7924043c095417f2e8b66d883b638a2d1cb9bf1e255" Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.912017 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s25nc" Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.935998 4788 scope.go:117] "RemoveContainer" containerID="c783a6197a3c72ac50f1dafd1ce37f010edfb90bce860903a05bd6e5590498ce" Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.942220 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s25nc"] Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.950616 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s25nc"] Feb 19 09:11:52 crc kubenswrapper[4788]: I0219 09:11:52.964387 4788 scope.go:117] "RemoveContainer" containerID="b78aeb0178005a07a3192d5e0232ce599847d74127f8a3a9d8bdda807ff10f1c" Feb 19 09:11:54 crc kubenswrapper[4788]: I0219 09:11:54.727204 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" path="/var/lib/kubelet/pods/ba8f366b-facb-455e-a5cd-76d6e075cfe5/volumes" Feb 19 09:11:57 crc kubenswrapper[4788]: I0219 09:11:57.714435 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:11:57 crc kubenswrapper[4788]: E0219 09:11:57.715907 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:11:59 crc kubenswrapper[4788]: I0219 09:11:59.990897 4788 generic.go:334] "Generic (PLEG): container finished" podID="952e8fd1-9634-4af1-8f56-62068214b66c" containerID="628c5e93adf4cbf402ad8b15f4caade4fe8cb7eb93884155a62483a09908213a" exitCode=0 Feb 19 09:11:59 crc kubenswrapper[4788]: 
I0219 09:11:59.991426 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" event={"ID":"952e8fd1-9634-4af1-8f56-62068214b66c","Type":"ContainerDied","Data":"628c5e93adf4cbf402ad8b15f4caade4fe8cb7eb93884155a62483a09908213a"} Feb 19 09:12:01 crc kubenswrapper[4788]: I0219 09:12:01.409723 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:12:01 crc kubenswrapper[4788]: I0219 09:12:01.549816 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-inventory\") pod \"952e8fd1-9634-4af1-8f56-62068214b66c\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " Feb 19 09:12:01 crc kubenswrapper[4788]: I0219 09:12:01.549935 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qcrz\" (UniqueName: \"kubernetes.io/projected/952e8fd1-9634-4af1-8f56-62068214b66c-kube-api-access-6qcrz\") pod \"952e8fd1-9634-4af1-8f56-62068214b66c\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " Feb 19 09:12:01 crc kubenswrapper[4788]: I0219 09:12:01.550000 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-ssh-key-openstack-edpm-ipam\") pod \"952e8fd1-9634-4af1-8f56-62068214b66c\" (UID: \"952e8fd1-9634-4af1-8f56-62068214b66c\") " Feb 19 09:12:01 crc kubenswrapper[4788]: I0219 09:12:01.555392 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/952e8fd1-9634-4af1-8f56-62068214b66c-kube-api-access-6qcrz" (OuterVolumeSpecName: "kube-api-access-6qcrz") pod "952e8fd1-9634-4af1-8f56-62068214b66c" (UID: "952e8fd1-9634-4af1-8f56-62068214b66c"). 
InnerVolumeSpecName "kube-api-access-6qcrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:12:01 crc kubenswrapper[4788]: I0219 09:12:01.577154 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "952e8fd1-9634-4af1-8f56-62068214b66c" (UID: "952e8fd1-9634-4af1-8f56-62068214b66c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:12:01 crc kubenswrapper[4788]: I0219 09:12:01.580477 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-inventory" (OuterVolumeSpecName: "inventory") pod "952e8fd1-9634-4af1-8f56-62068214b66c" (UID: "952e8fd1-9634-4af1-8f56-62068214b66c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:12:01 crc kubenswrapper[4788]: I0219 09:12:01.652855 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:12:01 crc kubenswrapper[4788]: I0219 09:12:01.652909 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qcrz\" (UniqueName: \"kubernetes.io/projected/952e8fd1-9634-4af1-8f56-62068214b66c-kube-api-access-6qcrz\") on node \"crc\" DevicePath \"\"" Feb 19 09:12:01 crc kubenswrapper[4788]: I0219 09:12:01.652922 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/952e8fd1-9634-4af1-8f56-62068214b66c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.013610 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" event={"ID":"952e8fd1-9634-4af1-8f56-62068214b66c","Type":"ContainerDied","Data":"c34a2f8c9b6c24362ea924d33a45332b6732ad645201a7e699d7612ef9e3939f"} Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.013650 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c34a2f8c9b6c24362ea924d33a45332b6732ad645201a7e699d7612ef9e3939f" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.013659 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.121920 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5"] Feb 19 09:12:02 crc kubenswrapper[4788]: E0219 09:12:02.122536 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" containerName="extract-utilities" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.122565 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" containerName="extract-utilities" Feb 19 09:12:02 crc kubenswrapper[4788]: E0219 09:12:02.122600 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" containerName="extract-content" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.122613 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" containerName="extract-content" Feb 19 09:12:02 crc kubenswrapper[4788]: E0219 09:12:02.122631 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc527cdf-b038-4c50-903f-816dd46ef453" containerName="registry-server" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.122643 4788 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fc527cdf-b038-4c50-903f-816dd46ef453" containerName="registry-server" Feb 19 09:12:02 crc kubenswrapper[4788]: E0219 09:12:02.122669 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" containerName="registry-server" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.122679 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" containerName="registry-server" Feb 19 09:12:02 crc kubenswrapper[4788]: E0219 09:12:02.122708 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952e8fd1-9634-4af1-8f56-62068214b66c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.122721 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="952e8fd1-9634-4af1-8f56-62068214b66c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 09:12:02 crc kubenswrapper[4788]: E0219 09:12:02.122741 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc527cdf-b038-4c50-903f-816dd46ef453" containerName="extract-utilities" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.122754 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc527cdf-b038-4c50-903f-816dd46ef453" containerName="extract-utilities" Feb 19 09:12:02 crc kubenswrapper[4788]: E0219 09:12:02.122768 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc527cdf-b038-4c50-903f-816dd46ef453" containerName="extract-content" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.122778 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc527cdf-b038-4c50-903f-816dd46ef453" containerName="extract-content" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.123093 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc527cdf-b038-4c50-903f-816dd46ef453" containerName="registry-server" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 
09:12:02.123130 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f366b-facb-455e-a5cd-76d6e075cfe5" containerName="registry-server" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.123147 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="952e8fd1-9634-4af1-8f56-62068214b66c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.124083 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.126547 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.127097 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.128391 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.128676 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.149115 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5"] Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.265129 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85p7q\" (UniqueName: \"kubernetes.io/projected/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-kube-api-access-85p7q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.265384 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.265772 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.366684 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.366755 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85p7q\" (UniqueName: \"kubernetes.io/projected/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-kube-api-access-85p7q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.366808 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.371367 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.373727 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.399331 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85p7q\" (UniqueName: \"kubernetes.io/projected/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-kube-api-access-85p7q\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:12:02 crc kubenswrapper[4788]: I0219 09:12:02.444026 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:12:03 crc kubenswrapper[4788]: I0219 09:12:03.082183 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5"] Feb 19 09:12:04 crc kubenswrapper[4788]: I0219 09:12:04.032194 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" event={"ID":"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f","Type":"ContainerStarted","Data":"6da8f0d955d680fe250f639af68f625a0eb9a7bd35f5b69d1a3142203293f0a6"} Feb 19 09:12:05 crc kubenswrapper[4788]: I0219 09:12:05.045474 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" event={"ID":"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f","Type":"ContainerStarted","Data":"cb8bafa838e5dd0a6ed482c918d3544e0c265b789377b2d71375a39d4a3fdb70"} Feb 19 09:12:05 crc kubenswrapper[4788]: I0219 09:12:05.080639 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" podStartSLOduration=1.940181644 podStartE2EDuration="3.080601846s" podCreationTimestamp="2026-02-19 09:12:02 +0000 UTC" firstStartedPulling="2026-02-19 09:12:03.086131708 +0000 UTC m=+1625.074143200" lastFinishedPulling="2026-02-19 09:12:04.22655193 +0000 UTC m=+1626.214563402" observedRunningTime="2026-02-19 09:12:05.068792619 +0000 UTC m=+1627.056804101" watchObservedRunningTime="2026-02-19 09:12:05.080601846 +0000 UTC m=+1627.068613358" Feb 19 09:12:08 crc kubenswrapper[4788]: I0219 09:12:08.714839 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:12:08 crc kubenswrapper[4788]: E0219 09:12:08.716017 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:12:10 crc kubenswrapper[4788]: I0219 09:12:10.045989 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-ps9px"] Feb 19 09:12:10 crc kubenswrapper[4788]: I0219 09:12:10.055118 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-ps9px"] Feb 19 09:12:10 crc kubenswrapper[4788]: I0219 09:12:10.724866 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528e3c62-47f6-4cf1-8b32-06dd6657c9f6" path="/var/lib/kubelet/pods/528e3c62-47f6-4cf1-8b32-06dd6657c9f6/volumes" Feb 19 09:12:19 crc kubenswrapper[4788]: I0219 09:12:19.714548 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:12:19 crc kubenswrapper[4788]: E0219 09:12:19.715337 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:12:21 crc kubenswrapper[4788]: I0219 09:12:21.751530 4788 scope.go:117] "RemoveContainer" containerID="9820c56563cc79ee44a2fc15ef198e67fc06cb632c550f119cd9c88baa91b836" Feb 19 09:12:21 crc kubenswrapper[4788]: I0219 09:12:21.791007 4788 scope.go:117] "RemoveContainer" containerID="e16ff6da4ee97e161bb886b720b207a7f7e2740a3dbc8460a121b42b125b0f93" Feb 19 09:12:21 crc kubenswrapper[4788]: I0219 09:12:21.828805 4788 
scope.go:117] "RemoveContainer" containerID="03c8e46258faa59402c50956ae0fdb7548577672ca54b8637f9ef9880d95928f" Feb 19 09:12:21 crc kubenswrapper[4788]: I0219 09:12:21.909217 4788 scope.go:117] "RemoveContainer" containerID="4a9e154742d18f92d38fe4f84c0426a40a4f18da975faeaf7c398a442720a2a5" Feb 19 09:12:21 crc kubenswrapper[4788]: I0219 09:12:21.950895 4788 scope.go:117] "RemoveContainer" containerID="b684a72825a8b22fe4690f2297d04452cb5f4f0fdd16d107ac50e805be889f68" Feb 19 09:12:21 crc kubenswrapper[4788]: I0219 09:12:21.995022 4788 scope.go:117] "RemoveContainer" containerID="09c8b320d144ac0814de10a4fb2b0761acc8a243245d4f68d2647914529742be" Feb 19 09:12:22 crc kubenswrapper[4788]: I0219 09:12:22.035447 4788 scope.go:117] "RemoveContainer" containerID="1afe1fa0b2fdfb5e25d59edf4360695744b01c8e3bcbc4bc0ea9e750f57884eb" Feb 19 09:12:22 crc kubenswrapper[4788]: I0219 09:12:22.083037 4788 scope.go:117] "RemoveContainer" containerID="dd13df32eb25b537aa018234d33c9f9669b9d9298b92023335532fb4b817d8b8" Feb 19 09:12:22 crc kubenswrapper[4788]: I0219 09:12:22.125019 4788 scope.go:117] "RemoveContainer" containerID="78057c63673a15e66a2ca94e3c6e6f824657a9acd19f9a6319b211f221f08fd0" Feb 19 09:12:22 crc kubenswrapper[4788]: I0219 09:12:22.157777 4788 scope.go:117] "RemoveContainer" containerID="960586f2ddb2bd161f18289f539caa6c3a3e969df144acb5a08f9c63b0606487" Feb 19 09:12:22 crc kubenswrapper[4788]: I0219 09:12:22.201871 4788 scope.go:117] "RemoveContainer" containerID="074375f36cab7ec00e9b4968ca05f676bd5421a49af0a0c78ed0bcee0a498ba9" Feb 19 09:12:32 crc kubenswrapper[4788]: I0219 09:12:32.049873 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-p5cvn"] Feb 19 09:12:32 crc kubenswrapper[4788]: I0219 09:12:32.058287 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-p5cvn"] Feb 19 09:12:32 crc kubenswrapper[4788]: I0219 09:12:32.736061 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1bb605f6-3946-4a0b-b492-6b011811ec43" path="/var/lib/kubelet/pods/1bb605f6-3946-4a0b-b492-6b011811ec43/volumes" Feb 19 09:12:33 crc kubenswrapper[4788]: I0219 09:12:33.714717 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:12:33 crc kubenswrapper[4788]: E0219 09:12:33.715411 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:12:39 crc kubenswrapper[4788]: I0219 09:12:39.056824 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wllnd"] Feb 19 09:12:39 crc kubenswrapper[4788]: I0219 09:12:39.067705 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2dgzz"] Feb 19 09:12:39 crc kubenswrapper[4788]: I0219 09:12:39.078051 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wllnd"] Feb 19 09:12:39 crc kubenswrapper[4788]: I0219 09:12:39.087846 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2dgzz"] Feb 19 09:12:40 crc kubenswrapper[4788]: I0219 09:12:40.047356 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-wf626"] Feb 19 09:12:40 crc kubenswrapper[4788]: I0219 09:12:40.068668 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-wf626"] Feb 19 09:12:40 crc kubenswrapper[4788]: I0219 09:12:40.724571 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="145312e4-8a69-4c17-964b-2183e2ff66b4" path="/var/lib/kubelet/pods/145312e4-8a69-4c17-964b-2183e2ff66b4/volumes" Feb 19 09:12:40 
crc kubenswrapper[4788]: I0219 09:12:40.725174 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="708e9c03-709f-4846-bf0f-abb71c9e164f" path="/var/lib/kubelet/pods/708e9c03-709f-4846-bf0f-abb71c9e164f/volumes" Feb 19 09:12:40 crc kubenswrapper[4788]: I0219 09:12:40.725833 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7fed6a4-4d87-463f-84d2-942c28422b8b" path="/var/lib/kubelet/pods/c7fed6a4-4d87-463f-84d2-942c28422b8b/volumes" Feb 19 09:12:41 crc kubenswrapper[4788]: I0219 09:12:41.025374 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-8lpgx"] Feb 19 09:12:41 crc kubenswrapper[4788]: I0219 09:12:41.032929 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-8lpgx"] Feb 19 09:12:42 crc kubenswrapper[4788]: I0219 09:12:42.739011 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a3acb8-146c-47c0-9218-81cd2728edf9" path="/var/lib/kubelet/pods/c2a3acb8-146c-47c0-9218-81cd2728edf9/volumes" Feb 19 09:12:45 crc kubenswrapper[4788]: I0219 09:12:45.714434 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:12:45 crc kubenswrapper[4788]: E0219 09:12:45.714786 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:13:00 crc kubenswrapper[4788]: I0219 09:13:00.714890 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:13:00 crc kubenswrapper[4788]: E0219 09:13:00.715745 4788 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:13:07 crc kubenswrapper[4788]: I0219 09:13:07.698820 4788 generic.go:334] "Generic (PLEG): container finished" podID="e16c6a7e-f78e-45bd-9b1f-60d61f04e91f" containerID="cb8bafa838e5dd0a6ed482c918d3544e0c265b789377b2d71375a39d4a3fdb70" exitCode=0 Feb 19 09:13:07 crc kubenswrapper[4788]: I0219 09:13:07.698885 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" event={"ID":"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f","Type":"ContainerDied","Data":"cb8bafa838e5dd0a6ed482c918d3544e0c265b789377b2d71375a39d4a3fdb70"} Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.165465 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.257439 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-inventory\") pod \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.257644 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85p7q\" (UniqueName: \"kubernetes.io/projected/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-kube-api-access-85p7q\") pod \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.257699 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-ssh-key-openstack-edpm-ipam\") pod \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\" (UID: \"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f\") " Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.263483 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-kube-api-access-85p7q" (OuterVolumeSpecName: "kube-api-access-85p7q") pod "e16c6a7e-f78e-45bd-9b1f-60d61f04e91f" (UID: "e16c6a7e-f78e-45bd-9b1f-60d61f04e91f"). InnerVolumeSpecName "kube-api-access-85p7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.283983 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e16c6a7e-f78e-45bd-9b1f-60d61f04e91f" (UID: "e16c6a7e-f78e-45bd-9b1f-60d61f04e91f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.304395 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-inventory" (OuterVolumeSpecName: "inventory") pod "e16c6a7e-f78e-45bd-9b1f-60d61f04e91f" (UID: "e16c6a7e-f78e-45bd-9b1f-60d61f04e91f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.360564 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85p7q\" (UniqueName: \"kubernetes.io/projected/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-kube-api-access-85p7q\") on node \"crc\" DevicePath \"\"" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.360605 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.360617 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e16c6a7e-f78e-45bd-9b1f-60d61f04e91f-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.717163 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" 
event={"ID":"e16c6a7e-f78e-45bd-9b1f-60d61f04e91f","Type":"ContainerDied","Data":"6da8f0d955d680fe250f639af68f625a0eb9a7bd35f5b69d1a3142203293f0a6"} Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.717215 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da8f0d955d680fe250f639af68f625a0eb9a7bd35f5b69d1a3142203293f0a6" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.717292 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.801425 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx"] Feb 19 09:13:09 crc kubenswrapper[4788]: E0219 09:13:09.801900 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16c6a7e-f78e-45bd-9b1f-60d61f04e91f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.801923 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16c6a7e-f78e-45bd-9b1f-60d61f04e91f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.802151 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16c6a7e-f78e-45bd-9b1f-60d61f04e91f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.802934 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.807349 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.807408 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.807426 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.807354 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.828935 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx"] Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.873586 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.873651 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvll7\" (UniqueName: \"kubernetes.io/projected/c285f791-b6be-44ce-8534-6196c12656ad-kube-api-access-rvll7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 
09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.873686 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.976142 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.976202 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvll7\" (UniqueName: \"kubernetes.io/projected/c285f791-b6be-44ce-8534-6196c12656ad-kube-api-access-rvll7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.976233 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.981741 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.985896 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:09 crc kubenswrapper[4788]: I0219 09:13:09.997854 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvll7\" (UniqueName: \"kubernetes.io/projected/c285f791-b6be-44ce-8534-6196c12656ad-kube-api-access-rvll7\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:10 crc kubenswrapper[4788]: I0219 09:13:10.127979 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:10 crc kubenswrapper[4788]: I0219 09:13:10.832852 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx"] Feb 19 09:13:11 crc kubenswrapper[4788]: I0219 09:13:11.737985 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" event={"ID":"c285f791-b6be-44ce-8534-6196c12656ad","Type":"ContainerStarted","Data":"bfe512528f68b0a55262c70ec6148a31a1fa936a8b60b28032b4d72070e983c9"} Feb 19 09:13:11 crc kubenswrapper[4788]: I0219 09:13:11.738493 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" event={"ID":"c285f791-b6be-44ce-8534-6196c12656ad","Type":"ContainerStarted","Data":"c7a3d2408b254ec17ca1ec88b50a7c7da194deca112d737cb5794a0810e1b273"} Feb 19 09:13:12 crc kubenswrapper[4788]: I0219 09:13:12.780775 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" podStartSLOduration=3.147145128 podStartE2EDuration="3.780742375s" podCreationTimestamp="2026-02-19 09:13:09 +0000 UTC" firstStartedPulling="2026-02-19 09:13:10.841275872 +0000 UTC m=+1692.829287344" lastFinishedPulling="2026-02-19 09:13:11.474873109 +0000 UTC m=+1693.462884591" observedRunningTime="2026-02-19 09:13:12.77038653 +0000 UTC m=+1694.758398012" watchObservedRunningTime="2026-02-19 09:13:12.780742375 +0000 UTC m=+1694.768753847" Feb 19 09:13:14 crc kubenswrapper[4788]: I0219 09:13:14.714021 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:13:14 crc kubenswrapper[4788]: E0219 09:13:14.714586 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:13:16 crc kubenswrapper[4788]: I0219 09:13:16.788893 4788 generic.go:334] "Generic (PLEG): container finished" podID="c285f791-b6be-44ce-8534-6196c12656ad" containerID="bfe512528f68b0a55262c70ec6148a31a1fa936a8b60b28032b4d72070e983c9" exitCode=0 Feb 19 09:13:16 crc kubenswrapper[4788]: I0219 09:13:16.789003 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" event={"ID":"c285f791-b6be-44ce-8534-6196c12656ad","Type":"ContainerDied","Data":"bfe512528f68b0a55262c70ec6148a31a1fa936a8b60b28032b4d72070e983c9"} Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.284364 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.338171 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvll7\" (UniqueName: \"kubernetes.io/projected/c285f791-b6be-44ce-8534-6196c12656ad-kube-api-access-rvll7\") pod \"c285f791-b6be-44ce-8534-6196c12656ad\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.338395 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-ssh-key-openstack-edpm-ipam\") pod \"c285f791-b6be-44ce-8534-6196c12656ad\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.338429 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-inventory\") pod \"c285f791-b6be-44ce-8534-6196c12656ad\" (UID: \"c285f791-b6be-44ce-8534-6196c12656ad\") " Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.349577 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c285f791-b6be-44ce-8534-6196c12656ad-kube-api-access-rvll7" (OuterVolumeSpecName: "kube-api-access-rvll7") pod "c285f791-b6be-44ce-8534-6196c12656ad" (UID: "c285f791-b6be-44ce-8534-6196c12656ad"). InnerVolumeSpecName "kube-api-access-rvll7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.372466 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-inventory" (OuterVolumeSpecName: "inventory") pod "c285f791-b6be-44ce-8534-6196c12656ad" (UID: "c285f791-b6be-44ce-8534-6196c12656ad"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.379574 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c285f791-b6be-44ce-8534-6196c12656ad" (UID: "c285f791-b6be-44ce-8534-6196c12656ad"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.441319 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.441471 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c285f791-b6be-44ce-8534-6196c12656ad-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.441542 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvll7\" (UniqueName: \"kubernetes.io/projected/c285f791-b6be-44ce-8534-6196c12656ad-kube-api-access-rvll7\") on node \"crc\" DevicePath \"\"" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.818458 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" event={"ID":"c285f791-b6be-44ce-8534-6196c12656ad","Type":"ContainerDied","Data":"c7a3d2408b254ec17ca1ec88b50a7c7da194deca112d737cb5794a0810e1b273"} Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.821205 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7a3d2408b254ec17ca1ec88b50a7c7da194deca112d737cb5794a0810e1b273" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 
09:13:18.818858 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.920815 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"] Feb 19 09:13:18 crc kubenswrapper[4788]: E0219 09:13:18.921468 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c285f791-b6be-44ce-8534-6196c12656ad" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.921497 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c285f791-b6be-44ce-8534-6196c12656ad" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.921856 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="c285f791-b6be-44ce-8534-6196c12656ad" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.923173 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.928187 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.928525 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.929539 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.930002 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:13:18 crc kubenswrapper[4788]: I0219 09:13:18.934771 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"] Feb 19 09:13:19 crc kubenswrapper[4788]: I0219 09:13:19.066033 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsshh\" (UniqueName: \"kubernetes.io/projected/ffef0c7e-2933-4173-ac71-b61fa297cad9-kube-api-access-gsshh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-chr7d\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d" Feb 19 09:13:19 crc kubenswrapper[4788]: I0219 09:13:19.066728 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-chr7d\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d" Feb 19 09:13:19 crc kubenswrapper[4788]: I0219 09:13:19.067096 4788 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-chr7d\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"
Feb 19 09:13:19 crc kubenswrapper[4788]: I0219 09:13:19.169506 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-chr7d\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"
Feb 19 09:13:19 crc kubenswrapper[4788]: I0219 09:13:19.169582 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-chr7d\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"
Feb 19 09:13:19 crc kubenswrapper[4788]: I0219 09:13:19.169636 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsshh\" (UniqueName: \"kubernetes.io/projected/ffef0c7e-2933-4173-ac71-b61fa297cad9-kube-api-access-gsshh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-chr7d\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"
Feb 19 09:13:19 crc kubenswrapper[4788]: I0219 09:13:19.176387 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-chr7d\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"
Feb 19 09:13:19 crc kubenswrapper[4788]: I0219 09:13:19.204493 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsshh\" (UniqueName: \"kubernetes.io/projected/ffef0c7e-2933-4173-ac71-b61fa297cad9-kube-api-access-gsshh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-chr7d\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"
Feb 19 09:13:19 crc kubenswrapper[4788]: I0219 09:13:19.217349 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-chr7d\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"
Feb 19 09:13:19 crc kubenswrapper[4788]: I0219 09:13:19.303680 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"
Feb 19 09:13:19 crc kubenswrapper[4788]: I0219 09:13:19.983187 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"]
Feb 19 09:13:20 crc kubenswrapper[4788]: I0219 09:13:20.842327 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d" event={"ID":"ffef0c7e-2933-4173-ac71-b61fa297cad9","Type":"ContainerStarted","Data":"765212c0df5e593d3efb8ae0e3b8dcf2d31d70558f3d81cb2128cb78a9cccb94"}
Feb 19 09:13:21 crc kubenswrapper[4788]: I0219 09:13:21.857406 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d" event={"ID":"ffef0c7e-2933-4173-ac71-b61fa297cad9","Type":"ContainerStarted","Data":"e9143ef1707aa230b70d7af75ba0f127617b0f214cd7175959aeb7d29375071c"}
Feb 19 09:13:21 crc kubenswrapper[4788]: I0219 09:13:21.885291 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d" podStartSLOduration=2.494524662 podStartE2EDuration="3.8852217s" podCreationTimestamp="2026-02-19 09:13:18 +0000 UTC" firstStartedPulling="2026-02-19 09:13:19.998117568 +0000 UTC m=+1701.986129040" lastFinishedPulling="2026-02-19 09:13:21.388814606 +0000 UTC m=+1703.376826078" observedRunningTime="2026-02-19 09:13:21.876657398 +0000 UTC m=+1703.864668880" watchObservedRunningTime="2026-02-19 09:13:21.8852217 +0000 UTC m=+1703.873233172"
Feb 19 09:13:22 crc kubenswrapper[4788]: I0219 09:13:22.497671 4788 scope.go:117] "RemoveContainer" containerID="1d69b08f03a65afdc2102ef07ccfb54e9ca1781bf3905d5c42f892f9453078e1"
Feb 19 09:13:22 crc kubenswrapper[4788]: I0219 09:13:22.533099 4788 scope.go:117] "RemoveContainer" containerID="9b492f113d579d3a32fbe3bf60ec81d340c3244557313d4ea483895235b6e6df"
Feb 19 09:13:22 crc kubenswrapper[4788]: I0219 09:13:22.599360 4788 scope.go:117] "RemoveContainer" containerID="6f37c7d43101773aee7e9873708bce865b6a29baa2a68fbbe0db46bcb69bdbe4"
Feb 19 09:13:22 crc kubenswrapper[4788]: I0219 09:13:22.653303 4788 scope.go:117] "RemoveContainer" containerID="62e0209a0aafc547777ddc5ebb626e3bf3ed3823d504a8e54df4338e120db24a"
Feb 19 09:13:22 crc kubenswrapper[4788]: I0219 09:13:22.693200 4788 scope.go:117] "RemoveContainer" containerID="586d43b2216b6b3a2959b5d5b920474f8cbfddf9badec5646f683200069c85d3"
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.046510 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9025-account-create-update-nrk5c"]
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.056424 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-cshbz"]
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.069136 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-s7cvf"]
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.079181 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qvgbw"]
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.105629 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6d1f-account-create-update-sxrgh"]
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.118893 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6d1f-account-create-update-sxrgh"]
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.128739 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qvgbw"]
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.136361 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-s7cvf"]
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.145615 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d128-account-create-update-8ptfd"]
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.154583 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9025-account-create-update-nrk5c"]
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.165500 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d128-account-create-update-8ptfd"]
Feb 19 09:13:27 crc kubenswrapper[4788]: I0219 09:13:27.173797 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-cshbz"]
Feb 19 09:13:28 crc kubenswrapper[4788]: I0219 09:13:28.726350 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15bcd5c9-1256-4778-b245-3f19bc742903" path="/var/lib/kubelet/pods/15bcd5c9-1256-4778-b245-3f19bc742903/volumes"
Feb 19 09:13:28 crc kubenswrapper[4788]: I0219 09:13:28.727173 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e0102d8-4abf-499f-bfe8-149ace187639" path="/var/lib/kubelet/pods/2e0102d8-4abf-499f-bfe8-149ace187639/volumes"
Feb 19 09:13:28 crc kubenswrapper[4788]: I0219 09:13:28.727763 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43a0bc66-750b-4618-bd07-033c189eafcf" path="/var/lib/kubelet/pods/43a0bc66-750b-4618-bd07-033c189eafcf/volumes"
Feb 19 09:13:28 crc kubenswrapper[4788]: I0219 09:13:28.728313 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60fe20ca-6454-4f38-90ab-16facbb9fb53" path="/var/lib/kubelet/pods/60fe20ca-6454-4f38-90ab-16facbb9fb53/volumes"
Feb 19 09:13:28 crc kubenswrapper[4788]: I0219 09:13:28.729336 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbc3972e-f29c-430c-9da0-29f51a8e6a47" path="/var/lib/kubelet/pods/bbc3972e-f29c-430c-9da0-29f51a8e6a47/volumes"
Feb 19 09:13:28 crc kubenswrapper[4788]: I0219 09:13:28.729998 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db2289eb-977d-42f1-a70a-772737cc197a" path="/var/lib/kubelet/pods/db2289eb-977d-42f1-a70a-772737cc197a/volumes"
Feb 19 09:13:29 crc kubenswrapper[4788]: I0219 09:13:29.714076 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81"
Feb 19 09:13:29 crc kubenswrapper[4788]: E0219 09:13:29.714360 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:13:43 crc kubenswrapper[4788]: I0219 09:13:43.715330 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81"
Feb 19 09:13:43 crc kubenswrapper[4788]: E0219 09:13:43.716333 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:13:56 crc kubenswrapper[4788]: I0219 09:13:56.206851 4788 generic.go:334] "Generic (PLEG): container finished" podID="ffef0c7e-2933-4173-ac71-b61fa297cad9" containerID="e9143ef1707aa230b70d7af75ba0f127617b0f214cd7175959aeb7d29375071c" exitCode=0
Feb 19 09:13:56 crc kubenswrapper[4788]: I0219 09:13:56.206982 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d" event={"ID":"ffef0c7e-2933-4173-ac71-b61fa297cad9","Type":"ContainerDied","Data":"e9143ef1707aa230b70d7af75ba0f127617b0f214cd7175959aeb7d29375071c"}
Feb 19 09:13:57 crc kubenswrapper[4788]: I0219 09:13:57.657754 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"
Feb 19 09:13:57 crc kubenswrapper[4788]: I0219 09:13:57.690304 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsshh\" (UniqueName: \"kubernetes.io/projected/ffef0c7e-2933-4173-ac71-b61fa297cad9-kube-api-access-gsshh\") pod \"ffef0c7e-2933-4173-ac71-b61fa297cad9\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") "
Feb 19 09:13:57 crc kubenswrapper[4788]: I0219 09:13:57.690834 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-ssh-key-openstack-edpm-ipam\") pod \"ffef0c7e-2933-4173-ac71-b61fa297cad9\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") "
Feb 19 09:13:57 crc kubenswrapper[4788]: I0219 09:13:57.690881 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-inventory\") pod \"ffef0c7e-2933-4173-ac71-b61fa297cad9\" (UID: \"ffef0c7e-2933-4173-ac71-b61fa297cad9\") "
Feb 19 09:13:57 crc kubenswrapper[4788]: I0219 09:13:57.700639 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffef0c7e-2933-4173-ac71-b61fa297cad9-kube-api-access-gsshh" (OuterVolumeSpecName: "kube-api-access-gsshh") pod "ffef0c7e-2933-4173-ac71-b61fa297cad9" (UID: "ffef0c7e-2933-4173-ac71-b61fa297cad9"). InnerVolumeSpecName "kube-api-access-gsshh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:13:57 crc kubenswrapper[4788]: I0219 09:13:57.715164 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81"
Feb 19 09:13:57 crc kubenswrapper[4788]: E0219 09:13:57.715643 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:13:57 crc kubenswrapper[4788]: I0219 09:13:57.725931 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-inventory" (OuterVolumeSpecName: "inventory") pod "ffef0c7e-2933-4173-ac71-b61fa297cad9" (UID: "ffef0c7e-2933-4173-ac71-b61fa297cad9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:13:57 crc kubenswrapper[4788]: I0219 09:13:57.726981 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ffef0c7e-2933-4173-ac71-b61fa297cad9" (UID: "ffef0c7e-2933-4173-ac71-b61fa297cad9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:13:57 crc kubenswrapper[4788]: I0219 09:13:57.794846 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 09:13:57 crc kubenswrapper[4788]: I0219 09:13:57.794910 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsshh\" (UniqueName: \"kubernetes.io/projected/ffef0c7e-2933-4173-ac71-b61fa297cad9-kube-api-access-gsshh\") on node \"crc\" DevicePath \"\""
Feb 19 09:13:57 crc kubenswrapper[4788]: I0219 09:13:57.794936 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ffef0c7e-2933-4173-ac71-b61fa297cad9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.230985 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d" event={"ID":"ffef0c7e-2933-4173-ac71-b61fa297cad9","Type":"ContainerDied","Data":"765212c0df5e593d3efb8ae0e3b8dcf2d31d70558f3d81cb2128cb78a9cccb94"}
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.231033 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="765212c0df5e593d3efb8ae0e3b8dcf2d31d70558f3d81cb2128cb78a9cccb94"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.231068 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-chr7d"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.319515 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"]
Feb 19 09:13:58 crc kubenswrapper[4788]: E0219 09:13:58.319915 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffef0c7e-2933-4173-ac71-b61fa297cad9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.319933 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffef0c7e-2933-4173-ac71-b61fa297cad9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.320143 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffef0c7e-2933-4173-ac71-b61fa297cad9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.320797 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.325613 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.325647 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.325625 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.325870 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.327294 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"]
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.410324 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwft6\" (UniqueName: \"kubernetes.io/projected/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-kube-api-access-jwft6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rl87p\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.410508 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rl87p\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.410537 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rl87p\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.513408 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rl87p\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.513504 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rl87p\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.514706 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwft6\" (UniqueName: \"kubernetes.io/projected/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-kube-api-access-jwft6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rl87p\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.521032 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rl87p\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.521536 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rl87p\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.535102 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwft6\" (UniqueName: \"kubernetes.io/projected/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-kube-api-access-jwft6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rl87p\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:13:58 crc kubenswrapper[4788]: I0219 09:13:58.642765 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:13:59 crc kubenswrapper[4788]: I0219 09:13:59.209565 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"]
Feb 19 09:13:59 crc kubenswrapper[4788]: I0219 09:13:59.244725 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p" event={"ID":"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e","Type":"ContainerStarted","Data":"0ae94ab3e811da34083f0da24c56755002b0f1d0f5170848952b2e761d0320ae"}
Feb 19 09:14:00 crc kubenswrapper[4788]: I0219 09:14:00.342492 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 09:14:01 crc kubenswrapper[4788]: I0219 09:14:01.264606 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p" event={"ID":"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e","Type":"ContainerStarted","Data":"53eb993fc320772255a3c35acd2d52b78ae47acdb327b76903d3f9272daa8095"}
Feb 19 09:14:01 crc kubenswrapper[4788]: I0219 09:14:01.287781 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p" podStartSLOduration=2.161966794 podStartE2EDuration="3.28776331s" podCreationTimestamp="2026-02-19 09:13:58 +0000 UTC" firstStartedPulling="2026-02-19 09:13:59.212887677 +0000 UTC m=+1741.200899149" lastFinishedPulling="2026-02-19 09:14:00.338684193 +0000 UTC m=+1742.326695665" observedRunningTime="2026-02-19 09:14:01.283128376 +0000 UTC m=+1743.271139858" watchObservedRunningTime="2026-02-19 09:14:01.28776331 +0000 UTC m=+1743.275774782"
Feb 19 09:14:10 crc kubenswrapper[4788]: I0219 09:14:10.714409 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81"
Feb 19 09:14:10 crc kubenswrapper[4788]: E0219 09:14:10.715141 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:14:15 crc kubenswrapper[4788]: I0219 09:14:15.038871 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t48sf"]
Feb 19 09:14:15 crc kubenswrapper[4788]: I0219 09:14:15.066732 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t48sf"]
Feb 19 09:14:16 crc kubenswrapper[4788]: I0219 09:14:16.741502 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd8890d-e06b-45e5-865b-838e036ac302" path="/var/lib/kubelet/pods/dbd8890d-e06b-45e5-865b-838e036ac302/volumes"
Feb 19 09:14:21 crc kubenswrapper[4788]: I0219 09:14:21.715090 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81"
Feb 19 09:14:21 crc kubenswrapper[4788]: E0219 09:14:21.717172 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:14:22 crc kubenswrapper[4788]: I0219 09:14:22.846033 4788 scope.go:117] "RemoveContainer" containerID="5277c64d75c0cf2ad2f41d6b4441d76bf14236763e566f1da28ab2099fc982cb"
Feb 19 09:14:22 crc kubenswrapper[4788]: I0219 09:14:22.895492 4788 scope.go:117] "RemoveContainer" containerID="9aafee9d4dccba11500c2598bea95a81d3ffd40a43dd81ab4d4a24604be220f5"
Feb 19 09:14:22 crc kubenswrapper[4788]: I0219 09:14:22.933554 4788 scope.go:117] "RemoveContainer" containerID="645545c3d44a325aabfedef594b111d4f4f199768422929f115ef56de5621ed0"
Feb 19 09:14:23 crc kubenswrapper[4788]: I0219 09:14:23.001781 4788 scope.go:117] "RemoveContainer" containerID="ab0295045488273318b481d0a14297364c23bb2062591a3491f59dc6b17a95e1"
Feb 19 09:14:23 crc kubenswrapper[4788]: I0219 09:14:23.028440 4788 scope.go:117] "RemoveContainer" containerID="775500a2ee518a0719167f603a3bf25d996b96f34ab500016625d881ebfa0f19"
Feb 19 09:14:23 crc kubenswrapper[4788]: I0219 09:14:23.127833 4788 scope.go:117] "RemoveContainer" containerID="5730e69c134dd7e97b3782f5cdf06025a8ece910a225f19f11d56b280425847b"
Feb 19 09:14:23 crc kubenswrapper[4788]: I0219 09:14:23.195583 4788 scope.go:117] "RemoveContainer" containerID="a708a2d05af63e70ad0b55420f6da7bf33320ae6c612e91a990ec7a496d097d5"
Feb 19 09:14:32 crc kubenswrapper[4788]: I0219 09:14:32.715176 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81"
Feb 19 09:14:32 crc kubenswrapper[4788]: E0219 09:14:32.716179 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:14:38 crc kubenswrapper[4788]: I0219 09:14:38.049853 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-hwbvj"]
Feb 19 09:14:38 crc kubenswrapper[4788]: I0219 09:14:38.057555 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-hwbvj"]
Feb 19 09:14:38 crc kubenswrapper[4788]: I0219 09:14:38.730270 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b48c414-8fa5-4654-b4c6-457650a816b4" path="/var/lib/kubelet/pods/2b48c414-8fa5-4654-b4c6-457650a816b4/volumes"
Feb 19 09:14:43 crc kubenswrapper[4788]: I0219 09:14:43.638533 4788 generic.go:334] "Generic (PLEG): container finished" podID="fd3576f8-7d3f-464b-9bcd-b07fa37ef51e" containerID="53eb993fc320772255a3c35acd2d52b78ae47acdb327b76903d3f9272daa8095" exitCode=0
Feb 19 09:14:43 crc kubenswrapper[4788]: I0219 09:14:43.638604 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p" event={"ID":"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e","Type":"ContainerDied","Data":"53eb993fc320772255a3c35acd2d52b78ae47acdb327b76903d3f9272daa8095"}
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.056500 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.188520 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-inventory\") pod \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") "
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.188700 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwft6\" (UniqueName: \"kubernetes.io/projected/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-kube-api-access-jwft6\") pod \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") "
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.188814 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-ssh-key-openstack-edpm-ipam\") pod \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\" (UID: \"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e\") "
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.194467 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-kube-api-access-jwft6" (OuterVolumeSpecName: "kube-api-access-jwft6") pod "fd3576f8-7d3f-464b-9bcd-b07fa37ef51e" (UID: "fd3576f8-7d3f-464b-9bcd-b07fa37ef51e"). InnerVolumeSpecName "kube-api-access-jwft6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.220550 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fd3576f8-7d3f-464b-9bcd-b07fa37ef51e" (UID: "fd3576f8-7d3f-464b-9bcd-b07fa37ef51e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.226128 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-inventory" (OuterVolumeSpecName: "inventory") pod "fd3576f8-7d3f-464b-9bcd-b07fa37ef51e" (UID: "fd3576f8-7d3f-464b-9bcd-b07fa37ef51e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.290772 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.290812 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-inventory\") on node \"crc\" DevicePath \"\""
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.290821 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwft6\" (UniqueName: \"kubernetes.io/projected/fd3576f8-7d3f-464b-9bcd-b07fa37ef51e-kube-api-access-jwft6\") on node \"crc\" DevicePath \"\""
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.655707 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p" event={"ID":"fd3576f8-7d3f-464b-9bcd-b07fa37ef51e","Type":"ContainerDied","Data":"0ae94ab3e811da34083f0da24c56755002b0f1d0f5170848952b2e761d0320ae"}
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.655994 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ae94ab3e811da34083f0da24c56755002b0f1d0f5170848952b2e761d0320ae"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.655763 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rl87p"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.719882 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81"
Feb 19 09:14:45 crc kubenswrapper[4788]: E0219 09:14:45.720144 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.744394 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cx4ht"]
Feb 19 09:14:45 crc kubenswrapper[4788]: E0219 09:14:45.744779 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd3576f8-7d3f-464b-9bcd-b07fa37ef51e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.744799 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd3576f8-7d3f-464b-9bcd-b07fa37ef51e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.744986 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd3576f8-7d3f-464b-9bcd-b07fa37ef51e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.745570 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.747529 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.747929 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.748105 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.748457 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.754852 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cx4ht"]
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.901496 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cx4ht\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.902059 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqb68\" (UniqueName: \"kubernetes.io/projected/2317a010-4151-4577-804c-70f4e7fb2775-kube-api-access-tqb68\") pod \"ssh-known-hosts-edpm-deployment-cx4ht\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht"
Feb 19 09:14:45 crc kubenswrapper[4788]: I0219 09:14:45.902124 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cx4ht\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht"
Feb 19 09:14:46 crc kubenswrapper[4788]: I0219 09:14:46.003560 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cx4ht\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht"
Feb 19 09:14:46 crc kubenswrapper[4788]: I0219 09:14:46.003736 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqb68\" (UniqueName: \"kubernetes.io/projected/2317a010-4151-4577-804c-70f4e7fb2775-kube-api-access-tqb68\") pod \"ssh-known-hosts-edpm-deployment-cx4ht\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht"
Feb 19 09:14:46 crc kubenswrapper[4788]: I0219 09:14:46.003767 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cx4ht\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht"
Feb 19 09:14:46 crc kubenswrapper[4788]: I0219 09:14:46.007965 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cx4ht\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht"
Feb 19 09:14:46 crc kubenswrapper[4788]: I0219 09:14:46.008705 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cx4ht\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht"
Feb 19 09:14:46 crc kubenswrapper[4788]: I0219 09:14:46.029236 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqb68\" (UniqueName: \"kubernetes.io/projected/2317a010-4151-4577-804c-70f4e7fb2775-kube-api-access-tqb68\") pod \"ssh-known-hosts-edpm-deployment-cx4ht\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht"
Feb 19 09:14:46 crc kubenswrapper[4788]: I0219 09:14:46.063788 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht"
Feb 19 09:14:46 crc kubenswrapper[4788]: I0219 09:14:46.589130 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cx4ht"]
Feb 19 09:14:46 crc kubenswrapper[4788]: I0219 09:14:46.665278 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht" event={"ID":"2317a010-4151-4577-804c-70f4e7fb2775","Type":"ContainerStarted","Data":"ce24a2551df28fc57e39fadc66460e71467f542764519e3ace9d2b128b52de90"}
Feb 19 09:14:47 crc kubenswrapper[4788]: I0219 09:14:47.672487 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht" event={"ID":"2317a010-4151-4577-804c-70f4e7fb2775","Type":"ContainerStarted","Data":"a74c39f746ba9a404b6d8dd8742facf27a0037e496c290d5b8faed2c67e8c6c1"}
Feb 19 09:14:47 crc kubenswrapper[4788]: I0219 09:14:47.698116 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht" podStartSLOduration=2.166681637 podStartE2EDuration="2.698098534s" podCreationTimestamp="2026-02-19 09:14:45 +0000 UTC" firstStartedPulling="2026-02-19 09:14:46.590191589 +0000 UTC m=+1788.578203061" lastFinishedPulling="2026-02-19 09:14:47.121608466 +0000 UTC m=+1789.109619958" observedRunningTime="2026-02-19 09:14:47.690906677 +0000 UTC m=+1789.678918169" watchObservedRunningTime="2026-02-19 09:14:47.698098534 +0000 UTC m=+1789.686110006" Feb 19 09:14:48 crc kubenswrapper[4788]: I0219 09:14:48.038795 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9xcz8"] Feb 19 09:14:48 crc kubenswrapper[4788]: I0219 09:14:48.048265 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-9xcz8"] Feb 19 09:14:48 crc kubenswrapper[4788]: I0219 09:14:48.730261 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5fef27-0741-4f5a-9a12-fa6917cf16af" path="/var/lib/kubelet/pods/1b5fef27-0741-4f5a-9a12-fa6917cf16af/volumes" Feb 19 09:14:53 crc kubenswrapper[4788]: I0219 09:14:53.728789 4788 generic.go:334] "Generic (PLEG): container finished" podID="2317a010-4151-4577-804c-70f4e7fb2775" containerID="a74c39f746ba9a404b6d8dd8742facf27a0037e496c290d5b8faed2c67e8c6c1" exitCode=0 Feb 19 09:14:53 crc kubenswrapper[4788]: I0219 09:14:53.728892 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht" event={"ID":"2317a010-4151-4577-804c-70f4e7fb2775","Type":"ContainerDied","Data":"a74c39f746ba9a404b6d8dd8742facf27a0037e496c290d5b8faed2c67e8c6c1"} Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.209948 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.397885 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqb68\" (UniqueName: \"kubernetes.io/projected/2317a010-4151-4577-804c-70f4e7fb2775-kube-api-access-tqb68\") pod \"2317a010-4151-4577-804c-70f4e7fb2775\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.398320 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-inventory-0\") pod \"2317a010-4151-4577-804c-70f4e7fb2775\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.398495 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-ssh-key-openstack-edpm-ipam\") pod \"2317a010-4151-4577-804c-70f4e7fb2775\" (UID: \"2317a010-4151-4577-804c-70f4e7fb2775\") " Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.405838 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2317a010-4151-4577-804c-70f4e7fb2775-kube-api-access-tqb68" (OuterVolumeSpecName: "kube-api-access-tqb68") pod "2317a010-4151-4577-804c-70f4e7fb2775" (UID: "2317a010-4151-4577-804c-70f4e7fb2775"). InnerVolumeSpecName "kube-api-access-tqb68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.433171 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2317a010-4151-4577-804c-70f4e7fb2775" (UID: "2317a010-4151-4577-804c-70f4e7fb2775"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.434017 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "2317a010-4151-4577-804c-70f4e7fb2775" (UID: "2317a010-4151-4577-804c-70f4e7fb2775"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.502000 4788 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.502081 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2317a010-4151-4577-804c-70f4e7fb2775-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.502105 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqb68\" (UniqueName: \"kubernetes.io/projected/2317a010-4151-4577-804c-70f4e7fb2775-kube-api-access-tqb68\") on node \"crc\" DevicePath \"\"" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.751045 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht" 
event={"ID":"2317a010-4151-4577-804c-70f4e7fb2775","Type":"ContainerDied","Data":"ce24a2551df28fc57e39fadc66460e71467f542764519e3ace9d2b128b52de90"} Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.751329 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce24a2551df28fc57e39fadc66460e71467f542764519e3ace9d2b128b52de90" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.751126 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cx4ht" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.831459 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p"] Feb 19 09:14:55 crc kubenswrapper[4788]: E0219 09:14:55.831854 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2317a010-4151-4577-804c-70f4e7fb2775" containerName="ssh-known-hosts-edpm-deployment" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.831871 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="2317a010-4151-4577-804c-70f4e7fb2775" containerName="ssh-known-hosts-edpm-deployment" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.832094 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="2317a010-4151-4577-804c-70f4e7fb2775" containerName="ssh-known-hosts-edpm-deployment" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.832928 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.834985 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.836793 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.836974 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.837066 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:14:55 crc kubenswrapper[4788]: I0219 09:14:55.844857 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p"] Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.011173 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4tzh\" (UniqueName: \"kubernetes.io/projected/5900ffba-e746-46d2-bb1a-07ea800e9ff5-kube-api-access-x4tzh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbj5p\" (UID: \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.011232 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbj5p\" (UID: \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.011276 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbj5p\" (UID: \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.113509 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4tzh\" (UniqueName: \"kubernetes.io/projected/5900ffba-e746-46d2-bb1a-07ea800e9ff5-kube-api-access-x4tzh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbj5p\" (UID: \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.113960 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbj5p\" (UID: \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.114047 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbj5p\" (UID: \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.118004 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbj5p\" (UID: 
\"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.118376 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbj5p\" (UID: \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.130540 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4tzh\" (UniqueName: \"kubernetes.io/projected/5900ffba-e746-46d2-bb1a-07ea800e9ff5-kube-api-access-x4tzh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fbj5p\" (UID: \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.153264 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.689294 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p"] Feb 19 09:14:56 crc kubenswrapper[4788]: W0219 09:14:56.696311 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5900ffba_e746_46d2_bb1a_07ea800e9ff5.slice/crio-4e83336215a45a7ea981232ffa7fc3b69e99562189d04afcbe9f0330b3825d38 WatchSource:0}: Error finding container 4e83336215a45a7ea981232ffa7fc3b69e99562189d04afcbe9f0330b3825d38: Status 404 returned error can't find the container with id 4e83336215a45a7ea981232ffa7fc3b69e99562189d04afcbe9f0330b3825d38 Feb 19 09:14:56 crc kubenswrapper[4788]: I0219 09:14:56.759690 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" event={"ID":"5900ffba-e746-46d2-bb1a-07ea800e9ff5","Type":"ContainerStarted","Data":"4e83336215a45a7ea981232ffa7fc3b69e99562189d04afcbe9f0330b3825d38"} Feb 19 09:14:57 crc kubenswrapper[4788]: I0219 09:14:57.791938 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" event={"ID":"5900ffba-e746-46d2-bb1a-07ea800e9ff5","Type":"ContainerStarted","Data":"1e3791c25b3d9bcb945471566a4b5a4a8adb08a44f218ff9909cca47b8a2e79d"} Feb 19 09:14:57 crc kubenswrapper[4788]: I0219 09:14:57.815005 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" podStartSLOduration=2.334373504 podStartE2EDuration="2.814988278s" podCreationTimestamp="2026-02-19 09:14:55 +0000 UTC" firstStartedPulling="2026-02-19 09:14:56.6996787 +0000 UTC m=+1798.687690172" lastFinishedPulling="2026-02-19 09:14:57.180293464 +0000 UTC m=+1799.168304946" observedRunningTime="2026-02-19 
09:14:57.809024441 +0000 UTC m=+1799.797035913" watchObservedRunningTime="2026-02-19 09:14:57.814988278 +0000 UTC m=+1799.802999740" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.139909 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp"] Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.142010 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.144836 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.147565 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.151626 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp"] Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.298354 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5d37f92-a930-4cf4-a516-95482b5ee38f-config-volume\") pod \"collect-profiles-29524875-b6hpp\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.298786 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64vrh\" (UniqueName: \"kubernetes.io/projected/a5d37f92-a930-4cf4-a516-95482b5ee38f-kube-api-access-64vrh\") pod \"collect-profiles-29524875-b6hpp\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.298852 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5d37f92-a930-4cf4-a516-95482b5ee38f-secret-volume\") pod \"collect-profiles-29524875-b6hpp\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.400419 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64vrh\" (UniqueName: \"kubernetes.io/projected/a5d37f92-a930-4cf4-a516-95482b5ee38f-kube-api-access-64vrh\") pod \"collect-profiles-29524875-b6hpp\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.400475 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5d37f92-a930-4cf4-a516-95482b5ee38f-secret-volume\") pod \"collect-profiles-29524875-b6hpp\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.400566 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5d37f92-a930-4cf4-a516-95482b5ee38f-config-volume\") pod \"collect-profiles-29524875-b6hpp\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.401706 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a5d37f92-a930-4cf4-a516-95482b5ee38f-config-volume\") pod \"collect-profiles-29524875-b6hpp\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.412013 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5d37f92-a930-4cf4-a516-95482b5ee38f-secret-volume\") pod \"collect-profiles-29524875-b6hpp\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.432911 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64vrh\" (UniqueName: \"kubernetes.io/projected/a5d37f92-a930-4cf4-a516-95482b5ee38f-kube-api-access-64vrh\") pod \"collect-profiles-29524875-b6hpp\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.472046 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.715635 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:15:00 crc kubenswrapper[4788]: E0219 09:15:00.716627 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:15:00 crc kubenswrapper[4788]: I0219 09:15:00.929529 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp"] Feb 19 09:15:00 crc kubenswrapper[4788]: W0219 09:15:00.930276 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5d37f92_a930_4cf4_a516_95482b5ee38f.slice/crio-aca43a62dddec096fb4355a0e9d1982cc19aac72e94dfb7b46a002fba31380cc WatchSource:0}: Error finding container aca43a62dddec096fb4355a0e9d1982cc19aac72e94dfb7b46a002fba31380cc: Status 404 returned error can't find the container with id aca43a62dddec096fb4355a0e9d1982cc19aac72e94dfb7b46a002fba31380cc Feb 19 09:15:01 crc kubenswrapper[4788]: I0219 09:15:01.830724 4788 generic.go:334] "Generic (PLEG): container finished" podID="a5d37f92-a930-4cf4-a516-95482b5ee38f" containerID="e2c629b39fbcbf9a8642844bf8c16ad820171acb3bbf8ec25541bb7a5fbeed48" exitCode=0 Feb 19 09:15:01 crc kubenswrapper[4788]: I0219 09:15:01.830919 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" 
event={"ID":"a5d37f92-a930-4cf4-a516-95482b5ee38f","Type":"ContainerDied","Data":"e2c629b39fbcbf9a8642844bf8c16ad820171acb3bbf8ec25541bb7a5fbeed48"} Feb 19 09:15:01 crc kubenswrapper[4788]: I0219 09:15:01.831007 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" event={"ID":"a5d37f92-a930-4cf4-a516-95482b5ee38f","Type":"ContainerStarted","Data":"aca43a62dddec096fb4355a0e9d1982cc19aac72e94dfb7b46a002fba31380cc"} Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.181797 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.371090 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64vrh\" (UniqueName: \"kubernetes.io/projected/a5d37f92-a930-4cf4-a516-95482b5ee38f-kube-api-access-64vrh\") pod \"a5d37f92-a930-4cf4-a516-95482b5ee38f\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.371721 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5d37f92-a930-4cf4-a516-95482b5ee38f-secret-volume\") pod \"a5d37f92-a930-4cf4-a516-95482b5ee38f\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.372075 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5d37f92-a930-4cf4-a516-95482b5ee38f-config-volume\") pod \"a5d37f92-a930-4cf4-a516-95482b5ee38f\" (UID: \"a5d37f92-a930-4cf4-a516-95482b5ee38f\") " Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.372417 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5d37f92-a930-4cf4-a516-95482b5ee38f-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "a5d37f92-a930-4cf4-a516-95482b5ee38f" (UID: "a5d37f92-a930-4cf4-a516-95482b5ee38f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.372894 4788 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a5d37f92-a930-4cf4-a516-95482b5ee38f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.377853 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5d37f92-a930-4cf4-a516-95482b5ee38f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a5d37f92-a930-4cf4-a516-95482b5ee38f" (UID: "a5d37f92-a930-4cf4-a516-95482b5ee38f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.380361 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5d37f92-a930-4cf4-a516-95482b5ee38f-kube-api-access-64vrh" (OuterVolumeSpecName: "kube-api-access-64vrh") pod "a5d37f92-a930-4cf4-a516-95482b5ee38f" (UID: "a5d37f92-a930-4cf4-a516-95482b5ee38f"). InnerVolumeSpecName "kube-api-access-64vrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.474677 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64vrh\" (UniqueName: \"kubernetes.io/projected/a5d37f92-a930-4cf4-a516-95482b5ee38f-kube-api-access-64vrh\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.474714 4788 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a5d37f92-a930-4cf4-a516-95482b5ee38f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.848166 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" event={"ID":"a5d37f92-a930-4cf4-a516-95482b5ee38f","Type":"ContainerDied","Data":"aca43a62dddec096fb4355a0e9d1982cc19aac72e94dfb7b46a002fba31380cc"} Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.848205 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aca43a62dddec096fb4355a0e9d1982cc19aac72e94dfb7b46a002fba31380cc" Feb 19 09:15:03 crc kubenswrapper[4788]: I0219 09:15:03.848209 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524875-b6hpp" Feb 19 09:15:04 crc kubenswrapper[4788]: I0219 09:15:04.857813 4788 generic.go:334] "Generic (PLEG): container finished" podID="5900ffba-e746-46d2-bb1a-07ea800e9ff5" containerID="1e3791c25b3d9bcb945471566a4b5a4a8adb08a44f218ff9909cca47b8a2e79d" exitCode=0 Feb 19 09:15:04 crc kubenswrapper[4788]: I0219 09:15:04.857949 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" event={"ID":"5900ffba-e746-46d2-bb1a-07ea800e9ff5","Type":"ContainerDied","Data":"1e3791c25b3d9bcb945471566a4b5a4a8adb08a44f218ff9909cca47b8a2e79d"} Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.264882 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.448126 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-inventory\") pod \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\" (UID: \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.448334 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4tzh\" (UniqueName: \"kubernetes.io/projected/5900ffba-e746-46d2-bb1a-07ea800e9ff5-kube-api-access-x4tzh\") pod \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\" (UID: \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.448412 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-ssh-key-openstack-edpm-ipam\") pod \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\" (UID: \"5900ffba-e746-46d2-bb1a-07ea800e9ff5\") " Feb 19 09:15:06 
crc kubenswrapper[4788]: I0219 09:15:06.454554 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5900ffba-e746-46d2-bb1a-07ea800e9ff5-kube-api-access-x4tzh" (OuterVolumeSpecName: "kube-api-access-x4tzh") pod "5900ffba-e746-46d2-bb1a-07ea800e9ff5" (UID: "5900ffba-e746-46d2-bb1a-07ea800e9ff5"). InnerVolumeSpecName "kube-api-access-x4tzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.477856 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5900ffba-e746-46d2-bb1a-07ea800e9ff5" (UID: "5900ffba-e746-46d2-bb1a-07ea800e9ff5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.483324 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-inventory" (OuterVolumeSpecName: "inventory") pod "5900ffba-e746-46d2-bb1a-07ea800e9ff5" (UID: "5900ffba-e746-46d2-bb1a-07ea800e9ff5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.550759 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4tzh\" (UniqueName: \"kubernetes.io/projected/5900ffba-e746-46d2-bb1a-07ea800e9ff5-kube-api-access-x4tzh\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.550795 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.550805 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5900ffba-e746-46d2-bb1a-07ea800e9ff5-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.886081 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" event={"ID":"5900ffba-e746-46d2-bb1a-07ea800e9ff5","Type":"ContainerDied","Data":"4e83336215a45a7ea981232ffa7fc3b69e99562189d04afcbe9f0330b3825d38"} Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.886135 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e83336215a45a7ea981232ffa7fc3b69e99562189d04afcbe9f0330b3825d38" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.886224 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fbj5p" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.973641 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg"] Feb 19 09:15:06 crc kubenswrapper[4788]: E0219 09:15:06.981799 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5900ffba-e746-46d2-bb1a-07ea800e9ff5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.981840 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="5900ffba-e746-46d2-bb1a-07ea800e9ff5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 09:15:06 crc kubenswrapper[4788]: E0219 09:15:06.981872 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5d37f92-a930-4cf4-a516-95482b5ee38f" containerName="collect-profiles" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.981881 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5d37f92-a930-4cf4-a516-95482b5ee38f" containerName="collect-profiles" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.982318 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="5900ffba-e746-46d2-bb1a-07ea800e9ff5" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.982348 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5d37f92-a930-4cf4-a516-95482b5ee38f" containerName="collect-profiles" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.988872 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.992926 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.992991 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.993107 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.993304 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:15:06 crc kubenswrapper[4788]: I0219 09:15:06.997048 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg"] Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 09:15:07.173802 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxxd2\" (UniqueName: \"kubernetes.io/projected/a925d651-e8e3-4436-b6b8-4894a550431f-kube-api-access-qxxd2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 09:15:07.173875 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 
09:15:07.173938 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 09:15:07.276112 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxxd2\" (UniqueName: \"kubernetes.io/projected/a925d651-e8e3-4436-b6b8-4894a550431f-kube-api-access-qxxd2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 09:15:07.276164 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 09:15:07.276229 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 09:15:07.282453 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 09:15:07.283027 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 09:15:07.299109 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxxd2\" (UniqueName: \"kubernetes.io/projected/a925d651-e8e3-4436-b6b8-4894a550431f-kube-api-access-qxxd2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 09:15:07.317785 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 09:15:07.824803 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg"] Feb 19 09:15:07 crc kubenswrapper[4788]: I0219 09:15:07.899722 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" event={"ID":"a925d651-e8e3-4436-b6b8-4894a550431f","Type":"ContainerStarted","Data":"74e02e2565fd27ba6c70dd54c9ff7f094e71e8640f943e96842d40dc1e53532e"} Feb 19 09:15:08 crc kubenswrapper[4788]: I0219 09:15:08.912071 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" event={"ID":"a925d651-e8e3-4436-b6b8-4894a550431f","Type":"ContainerStarted","Data":"7a6d8ffca1f42af12dfc6e3b1819cd62c88e9bbe80a7b220314fa0f3355ae841"} Feb 19 09:15:08 crc kubenswrapper[4788]: I0219 09:15:08.925100 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" podStartSLOduration=2.4544391 podStartE2EDuration="2.925078528s" podCreationTimestamp="2026-02-19 09:15:06 +0000 UTC" firstStartedPulling="2026-02-19 09:15:07.830700106 +0000 UTC m=+1809.818711578" lastFinishedPulling="2026-02-19 09:15:08.301339534 +0000 UTC m=+1810.289351006" observedRunningTime="2026-02-19 09:15:08.924623406 +0000 UTC m=+1810.912634878" watchObservedRunningTime="2026-02-19 09:15:08.925078528 +0000 UTC m=+1810.913090000" Feb 19 09:15:13 crc kubenswrapper[4788]: I0219 09:15:13.715153 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:15:13 crc kubenswrapper[4788]: E0219 09:15:13.715482 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:15:18 crc kubenswrapper[4788]: I0219 09:15:18.020150 4788 generic.go:334] "Generic (PLEG): container finished" podID="a925d651-e8e3-4436-b6b8-4894a550431f" containerID="7a6d8ffca1f42af12dfc6e3b1819cd62c88e9bbe80a7b220314fa0f3355ae841" exitCode=0 Feb 19 09:15:18 crc kubenswrapper[4788]: I0219 09:15:18.020265 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" event={"ID":"a925d651-e8e3-4436-b6b8-4894a550431f","Type":"ContainerDied","Data":"7a6d8ffca1f42af12dfc6e3b1819cd62c88e9bbe80a7b220314fa0f3355ae841"} Feb 19 09:15:19 crc kubenswrapper[4788]: I0219 09:15:19.481124 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:19 crc kubenswrapper[4788]: I0219 09:15:19.615964 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-inventory\") pod \"a925d651-e8e3-4436-b6b8-4894a550431f\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " Feb 19 09:15:19 crc kubenswrapper[4788]: I0219 09:15:19.616475 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxxd2\" (UniqueName: \"kubernetes.io/projected/a925d651-e8e3-4436-b6b8-4894a550431f-kube-api-access-qxxd2\") pod \"a925d651-e8e3-4436-b6b8-4894a550431f\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " Feb 19 09:15:19 crc kubenswrapper[4788]: I0219 09:15:19.616691 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-ssh-key-openstack-edpm-ipam\") pod \"a925d651-e8e3-4436-b6b8-4894a550431f\" (UID: \"a925d651-e8e3-4436-b6b8-4894a550431f\") " Feb 19 09:15:19 crc kubenswrapper[4788]: I0219 09:15:19.621751 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a925d651-e8e3-4436-b6b8-4894a550431f-kube-api-access-qxxd2" (OuterVolumeSpecName: "kube-api-access-qxxd2") pod "a925d651-e8e3-4436-b6b8-4894a550431f" (UID: "a925d651-e8e3-4436-b6b8-4894a550431f"). InnerVolumeSpecName "kube-api-access-qxxd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:15:19 crc kubenswrapper[4788]: I0219 09:15:19.649030 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-inventory" (OuterVolumeSpecName: "inventory") pod "a925d651-e8e3-4436-b6b8-4894a550431f" (UID: "a925d651-e8e3-4436-b6b8-4894a550431f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:19 crc kubenswrapper[4788]: I0219 09:15:19.649805 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a925d651-e8e3-4436-b6b8-4894a550431f" (UID: "a925d651-e8e3-4436-b6b8-4894a550431f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:19 crc kubenswrapper[4788]: I0219 09:15:19.718664 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxxd2\" (UniqueName: \"kubernetes.io/projected/a925d651-e8e3-4436-b6b8-4894a550431f-kube-api-access-qxxd2\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:19 crc kubenswrapper[4788]: I0219 09:15:19.718703 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:19 crc kubenswrapper[4788]: I0219 09:15:19.718718 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a925d651-e8e3-4436-b6b8-4894a550431f-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.041957 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" event={"ID":"a925d651-e8e3-4436-b6b8-4894a550431f","Type":"ContainerDied","Data":"74e02e2565fd27ba6c70dd54c9ff7f094e71e8640f943e96842d40dc1e53532e"} Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.042459 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74e02e2565fd27ba6c70dd54c9ff7f094e71e8640f943e96842d40dc1e53532e" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.042067 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.170617 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms"] Feb 19 09:15:20 crc kubenswrapper[4788]: E0219 09:15:20.172268 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a925d651-e8e3-4436-b6b8-4894a550431f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.172299 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="a925d651-e8e3-4436-b6b8-4894a550431f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.172498 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="a925d651-e8e3-4436-b6b8-4894a550431f" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.173137 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.178749 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms"] Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.179568 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.179706 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.179664 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.179952 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.180219 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.180395 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.180491 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.180671 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.328594 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.328929 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.328959 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.328985 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.329011 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.329035 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.329063 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.329435 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.329525 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pknxj\" (UniqueName: 
\"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-kube-api-access-pknxj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.329565 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.329639 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.329678 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.329719 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.329841 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.431517 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pknxj\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-kube-api-access-pknxj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.431573 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.431621 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: 
\"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.431640 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.431670 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.431727 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.431750 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.431788 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.431899 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.432440 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.432484 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.432512 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.432557 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.432597 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.442541 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 
09:15:20.443398 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.445892 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.448128 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.449395 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.449917 4788 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.450209 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.450572 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.451156 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.452346 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.453969 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.458795 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.464110 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pknxj\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-kube-api-access-pknxj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.464829 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-2vvms\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:20 crc kubenswrapper[4788]: I0219 09:15:20.499708 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:21 crc kubenswrapper[4788]: I0219 09:15:21.119604 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms"] Feb 19 09:15:22 crc kubenswrapper[4788]: I0219 09:15:22.067368 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" event={"ID":"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e","Type":"ContainerStarted","Data":"0512092f68344f3f6d36f9003e67b21730b11441994d29cb98f925f50530eefb"} Feb 19 09:15:23 crc kubenswrapper[4788]: I0219 09:15:23.078199 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" event={"ID":"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e","Type":"ContainerStarted","Data":"6c8a52cf339eda8865b0f12b1615407a53388bf93325759e7916af491f522751"} Feb 19 09:15:23 crc kubenswrapper[4788]: I0219 09:15:23.103866 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" podStartSLOduration=2.4211972680000002 podStartE2EDuration="3.103847551s" podCreationTimestamp="2026-02-19 09:15:20 +0000 UTC" firstStartedPulling="2026-02-19 09:15:21.128342254 +0000 UTC m=+1823.116353726" lastFinishedPulling="2026-02-19 09:15:21.810992497 +0000 UTC m=+1823.799004009" observedRunningTime="2026-02-19 09:15:23.100457716 +0000 UTC m=+1825.088469198" watchObservedRunningTime="2026-02-19 09:15:23.103847551 +0000 UTC m=+1825.091859023" Feb 19 09:15:23 crc kubenswrapper[4788]: I0219 09:15:23.383525 4788 scope.go:117] "RemoveContainer" containerID="5ae3e1f3005b2aa94da7d59bcd79e83657d3225550134345aa9ace61da634ba0" 
Feb 19 09:15:23 crc kubenswrapper[4788]: I0219 09:15:23.420375 4788 scope.go:117] "RemoveContainer" containerID="1e28682a2bb344feef2f7fa689eee510e022cba1dd36cd73f323b4403d744975" Feb 19 09:15:24 crc kubenswrapper[4788]: I0219 09:15:24.038702 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-kkt72"] Feb 19 09:15:24 crc kubenswrapper[4788]: I0219 09:15:24.047918 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-kkt72"] Feb 19 09:15:24 crc kubenswrapper[4788]: I0219 09:15:24.729095 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88c8f267-83eb-4e22-9e99-78c3dc096823" path="/var/lib/kubelet/pods/88c8f267-83eb-4e22-9e99-78c3dc096823/volumes" Feb 19 09:15:25 crc kubenswrapper[4788]: I0219 09:15:25.714830 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:15:25 crc kubenswrapper[4788]: E0219 09:15:25.715073 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:15:39 crc kubenswrapper[4788]: I0219 09:15:39.714821 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:15:39 crc kubenswrapper[4788]: E0219 09:15:39.715926 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:15:52 crc kubenswrapper[4788]: I0219 09:15:52.714960 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:15:53 crc kubenswrapper[4788]: I0219 09:15:53.347406 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"d2c63370c54186978ab16c8be5655d7593a49e64407ae314b67deb51ba19c912"} Feb 19 09:15:55 crc kubenswrapper[4788]: I0219 09:15:55.365990 4788 generic.go:334] "Generic (PLEG): container finished" podID="cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" containerID="6c8a52cf339eda8865b0f12b1615407a53388bf93325759e7916af491f522751" exitCode=0 Feb 19 09:15:55 crc kubenswrapper[4788]: I0219 09:15:55.366107 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" event={"ID":"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e","Type":"ContainerDied","Data":"6c8a52cf339eda8865b0f12b1615407a53388bf93325759e7916af491f522751"} Feb 19 09:15:56 crc kubenswrapper[4788]: I0219 09:15:56.867063 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.002896 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-bootstrap-combined-ca-bundle\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.003000 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-telemetry-combined-ca-bundle\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.003025 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-nova-combined-ca-bundle\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.003061 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ovn-combined-ca-bundle\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.003116 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ssh-key-openstack-edpm-ipam\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc 
kubenswrapper[4788]: I0219 09:15:57.003200 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.003232 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-libvirt-combined-ca-bundle\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.003279 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.003299 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.003337 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-inventory\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 
09:15:57.003368 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-neutron-metadata-combined-ca-bundle\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.003482 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-repo-setup-combined-ca-bundle\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.003539 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pknxj\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-kube-api-access-pknxj\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.003588 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\" (UID: \"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e\") " Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.012514 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.012878 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.013117 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.013211 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.013315 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.014137 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.015985 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.016113 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-kube-api-access-pknxj" (OuterVolumeSpecName: "kube-api-access-pknxj") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "kube-api-access-pknxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.016831 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.017764 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.017821 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.018692 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.041171 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.052929 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-inventory" (OuterVolumeSpecName: "inventory") pod "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" (UID: "cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.105771 4788 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.105838 4788 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.105853 4788 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.105870 4788 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.105884 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc 
kubenswrapper[4788]: I0219 09:15:57.105902 4788 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.105920 4788 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.105940 4788 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.105955 4788 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.105970 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.105984 4788 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.106001 4788 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.106015 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pknxj\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-kube-api-access-pknxj\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.106031 4788 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.386832 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" event={"ID":"cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e","Type":"ContainerDied","Data":"0512092f68344f3f6d36f9003e67b21730b11441994d29cb98f925f50530eefb"} Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.386886 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0512092f68344f3f6d36f9003e67b21730b11441994d29cb98f925f50530eefb" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.387310 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-2vvms" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.478383 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t"] Feb 19 09:15:57 crc kubenswrapper[4788]: E0219 09:15:57.478900 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.478923 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.479160 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.479981 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.483388 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.483854 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.483873 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.484330 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.486479 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.490193 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t"] Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.513961 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.514370 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.514434 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.514562 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0db811eb-116f-4653-94a4-467209ef8e49-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.514703 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxr5w\" (UniqueName: \"kubernetes.io/projected/0db811eb-116f-4653-94a4-467209ef8e49-kube-api-access-gxr5w\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.615789 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxr5w\" (UniqueName: \"kubernetes.io/projected/0db811eb-116f-4653-94a4-467209ef8e49-kube-api-access-gxr5w\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.615896 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.615954 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.616035 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.616915 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0db811eb-116f-4653-94a4-467209ef8e49-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.617853 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0db811eb-116f-4653-94a4-467209ef8e49-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: 
\"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.621978 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.623678 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.629806 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.633048 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxr5w\" (UniqueName: \"kubernetes.io/projected/0db811eb-116f-4653-94a4-467209ef8e49-kube-api-access-gxr5w\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-hj98t\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:57 crc kubenswrapper[4788]: I0219 09:15:57.798841 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:15:58 crc kubenswrapper[4788]: I0219 09:15:58.398947 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t"] Feb 19 09:15:58 crc kubenswrapper[4788]: W0219 09:15:58.405210 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db811eb_116f_4653_94a4_467209ef8e49.slice/crio-ba4992f7161db4a2a459d334ec841f29b1d95095b7971752559f5912def5448c WatchSource:0}: Error finding container ba4992f7161db4a2a459d334ec841f29b1d95095b7971752559f5912def5448c: Status 404 returned error can't find the container with id ba4992f7161db4a2a459d334ec841f29b1d95095b7971752559f5912def5448c Feb 19 09:15:58 crc kubenswrapper[4788]: I0219 09:15:58.407791 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:15:58 crc kubenswrapper[4788]: I0219 09:15:58.954224 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:15:59 crc kubenswrapper[4788]: I0219 09:15:59.403278 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" event={"ID":"0db811eb-116f-4653-94a4-467209ef8e49","Type":"ContainerStarted","Data":"aa240b9dc55d7affad1d5c64ea8f8b3f248d197e2f51394e78d6581bdb21566a"} Feb 19 09:15:59 crc kubenswrapper[4788]: I0219 09:15:59.403332 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" event={"ID":"0db811eb-116f-4653-94a4-467209ef8e49","Type":"ContainerStarted","Data":"ba4992f7161db4a2a459d334ec841f29b1d95095b7971752559f5912def5448c"} Feb 19 09:15:59 crc kubenswrapper[4788]: I0219 09:15:59.421859 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" podStartSLOduration=1.877436667 podStartE2EDuration="2.421835631s" podCreationTimestamp="2026-02-19 09:15:57 +0000 UTC" firstStartedPulling="2026-02-19 09:15:58.407597199 +0000 UTC m=+1860.395608671" lastFinishedPulling="2026-02-19 09:15:58.951996163 +0000 UTC m=+1860.940007635" observedRunningTime="2026-02-19 09:15:59.417691408 +0000 UTC m=+1861.405702900" watchObservedRunningTime="2026-02-19 09:15:59.421835631 +0000 UTC m=+1861.409847103" Feb 19 09:16:23 crc kubenswrapper[4788]: I0219 09:16:23.551560 4788 scope.go:117] "RemoveContainer" containerID="21d342bbc201906aca6674b1f66d888bdcd3ff4baec7beab73679a1108af5082" Feb 19 09:16:56 crc kubenswrapper[4788]: I0219 09:16:56.992936 4788 generic.go:334] "Generic (PLEG): container finished" podID="0db811eb-116f-4653-94a4-467209ef8e49" containerID="aa240b9dc55d7affad1d5c64ea8f8b3f248d197e2f51394e78d6581bdb21566a" exitCode=0 Feb 19 09:16:56 crc kubenswrapper[4788]: I0219 09:16:56.993044 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" event={"ID":"0db811eb-116f-4653-94a4-467209ef8e49","Type":"ContainerDied","Data":"aa240b9dc55d7affad1d5c64ea8f8b3f248d197e2f51394e78d6581bdb21566a"} Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.448557 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.570882 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0db811eb-116f-4653-94a4-467209ef8e49-ovncontroller-config-0\") pod \"0db811eb-116f-4653-94a4-467209ef8e49\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.571223 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxr5w\" (UniqueName: \"kubernetes.io/projected/0db811eb-116f-4653-94a4-467209ef8e49-kube-api-access-gxr5w\") pod \"0db811eb-116f-4653-94a4-467209ef8e49\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.571363 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-inventory\") pod \"0db811eb-116f-4653-94a4-467209ef8e49\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.571410 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ssh-key-openstack-edpm-ipam\") pod \"0db811eb-116f-4653-94a4-467209ef8e49\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.571436 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ovn-combined-ca-bundle\") pod \"0db811eb-116f-4653-94a4-467209ef8e49\" (UID: \"0db811eb-116f-4653-94a4-467209ef8e49\") " Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.579172 4788 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0db811eb-116f-4653-94a4-467209ef8e49-kube-api-access-gxr5w" (OuterVolumeSpecName: "kube-api-access-gxr5w") pod "0db811eb-116f-4653-94a4-467209ef8e49" (UID: "0db811eb-116f-4653-94a4-467209ef8e49"). InnerVolumeSpecName "kube-api-access-gxr5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.580283 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0db811eb-116f-4653-94a4-467209ef8e49" (UID: "0db811eb-116f-4653-94a4-467209ef8e49"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.598723 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0db811eb-116f-4653-94a4-467209ef8e49-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "0db811eb-116f-4653-94a4-467209ef8e49" (UID: "0db811eb-116f-4653-94a4-467209ef8e49"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.607849 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-inventory" (OuterVolumeSpecName: "inventory") pod "0db811eb-116f-4653-94a4-467209ef8e49" (UID: "0db811eb-116f-4653-94a4-467209ef8e49"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.613577 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0db811eb-116f-4653-94a4-467209ef8e49" (UID: "0db811eb-116f-4653-94a4-467209ef8e49"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.673792 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.673838 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.673856 4788 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0db811eb-116f-4653-94a4-467209ef8e49-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.673870 4788 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0db811eb-116f-4653-94a4-467209ef8e49-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:16:58 crc kubenswrapper[4788]: I0219 09:16:58.673882 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxr5w\" (UniqueName: \"kubernetes.io/projected/0db811eb-116f-4653-94a4-467209ef8e49-kube-api-access-gxr5w\") on node \"crc\" DevicePath \"\"" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.012399 4788 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" event={"ID":"0db811eb-116f-4653-94a4-467209ef8e49","Type":"ContainerDied","Data":"ba4992f7161db4a2a459d334ec841f29b1d95095b7971752559f5912def5448c"} Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.012444 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba4992f7161db4a2a459d334ec841f29b1d95095b7971752559f5912def5448c" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.012504 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-hj98t" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.146653 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk"] Feb 19 09:16:59 crc kubenswrapper[4788]: E0219 09:16:59.147170 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0db811eb-116f-4653-94a4-467209ef8e49" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.147191 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="0db811eb-116f-4653-94a4-467209ef8e49" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.147493 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="0db811eb-116f-4653-94a4-467209ef8e49" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.148384 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.156142 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.156475 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.156733 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.156469 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.156964 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk"] Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.156824 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.157436 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.289474 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.289620 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.289663 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vxzg\" (UniqueName: \"kubernetes.io/projected/b5eefb80-777b-4277-88f1-ac900e3d1b1f-kube-api-access-2vxzg\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.289788 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.289847 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.289899 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.391655 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.391715 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.391736 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.391766 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.391848 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.391869 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxzg\" (UniqueName: \"kubernetes.io/projected/b5eefb80-777b-4277-88f1-ac900e3d1b1f-kube-api-access-2vxzg\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.395893 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.396455 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" 
(UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.396892 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.397775 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.400123 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.408925 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vxzg\" (UniqueName: \"kubernetes.io/projected/b5eefb80-777b-4277-88f1-ac900e3d1b1f-kube-api-access-2vxzg\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 
19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.474341 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:16:59 crc kubenswrapper[4788]: I0219 09:16:59.839158 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk"] Feb 19 09:16:59 crc kubenswrapper[4788]: W0219 09:16:59.845066 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5eefb80_777b_4277_88f1_ac900e3d1b1f.slice/crio-70482b143bfe0b9242c6c2d419f8084332a101f5bad7a0f1bcd488cba504cd26 WatchSource:0}: Error finding container 70482b143bfe0b9242c6c2d419f8084332a101f5bad7a0f1bcd488cba504cd26: Status 404 returned error can't find the container with id 70482b143bfe0b9242c6c2d419f8084332a101f5bad7a0f1bcd488cba504cd26 Feb 19 09:17:00 crc kubenswrapper[4788]: I0219 09:17:00.024720 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" event={"ID":"b5eefb80-777b-4277-88f1-ac900e3d1b1f","Type":"ContainerStarted","Data":"70482b143bfe0b9242c6c2d419f8084332a101f5bad7a0f1bcd488cba504cd26"} Feb 19 09:17:01 crc kubenswrapper[4788]: I0219 09:17:01.038181 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" event={"ID":"b5eefb80-777b-4277-88f1-ac900e3d1b1f","Type":"ContainerStarted","Data":"7734fe28c4bd92fd292ed90bfdedf49048611b5da6923ea9b663bc9137ce568c"} Feb 19 09:17:08 crc kubenswrapper[4788]: E0219 09:17:08.550224 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db811eb_116f_4653_94a4_467209ef8e49.slice/crio-ba4992f7161db4a2a459d334ec841f29b1d95095b7971752559f5912def5448c\": 
RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db811eb_116f_4653_94a4_467209ef8e49.slice\": RecentStats: unable to find data in memory cache]" Feb 19 09:17:18 crc kubenswrapper[4788]: E0219 09:17:18.818890 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db811eb_116f_4653_94a4_467209ef8e49.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db811eb_116f_4653_94a4_467209ef8e49.slice/crio-ba4992f7161db4a2a459d334ec841f29b1d95095b7971752559f5912def5448c\": RecentStats: unable to find data in memory cache]" Feb 19 09:17:23 crc kubenswrapper[4788]: I0219 09:17:23.673311 4788 scope.go:117] "RemoveContainer" containerID="6936dd575e2de31438bd3d20258de1db870952cf5817141bc86647cda84b8b5e" Feb 19 09:17:23 crc kubenswrapper[4788]: I0219 09:17:23.709761 4788 scope.go:117] "RemoveContainer" containerID="d2c5167064437ae34ed61c8e24fdfca9fcdbb450b58ec22a484d62448d575909" Feb 19 09:17:23 crc kubenswrapper[4788]: I0219 09:17:23.763495 4788 scope.go:117] "RemoveContainer" containerID="86a34cdbd701cb26cfbfe885a0e8e55ae375ba95e8b484a66a4191e1185117de" Feb 19 09:17:29 crc kubenswrapper[4788]: E0219 09:17:29.065645 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db811eb_116f_4653_94a4_467209ef8e49.slice/crio-ba4992f7161db4a2a459d334ec841f29b1d95095b7971752559f5912def5448c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db811eb_116f_4653_94a4_467209ef8e49.slice\": RecentStats: unable to find data in memory cache]" Feb 19 09:17:39 crc kubenswrapper[4788]: E0219 09:17:39.321294 4788 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db811eb_116f_4653_94a4_467209ef8e49.slice/crio-ba4992f7161db4a2a459d334ec841f29b1d95095b7971752559f5912def5448c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db811eb_116f_4653_94a4_467209ef8e49.slice\": RecentStats: unable to find data in memory cache]" Feb 19 09:17:47 crc kubenswrapper[4788]: I0219 09:17:47.533638 4788 generic.go:334] "Generic (PLEG): container finished" podID="b5eefb80-777b-4277-88f1-ac900e3d1b1f" containerID="7734fe28c4bd92fd292ed90bfdedf49048611b5da6923ea9b663bc9137ce568c" exitCode=0 Feb 19 09:17:47 crc kubenswrapper[4788]: I0219 09:17:47.533752 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" event={"ID":"b5eefb80-777b-4277-88f1-ac900e3d1b1f","Type":"ContainerDied","Data":"7734fe28c4bd92fd292ed90bfdedf49048611b5da6923ea9b663bc9137ce568c"} Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.014870 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.166786 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-inventory\") pod \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.166969 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vxzg\" (UniqueName: \"kubernetes.io/projected/b5eefb80-777b-4277-88f1-ac900e3d1b1f-kube-api-access-2vxzg\") pod \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.167027 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-nova-metadata-neutron-config-0\") pod \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.167106 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-ssh-key-openstack-edpm-ipam\") pod \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.167184 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " Feb 19 09:17:49 crc 
kubenswrapper[4788]: I0219 09:17:49.167357 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-metadata-combined-ca-bundle\") pod \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\" (UID: \"b5eefb80-777b-4277-88f1-ac900e3d1b1f\") " Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.174632 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "b5eefb80-777b-4277-88f1-ac900e3d1b1f" (UID: "b5eefb80-777b-4277-88f1-ac900e3d1b1f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.175653 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5eefb80-777b-4277-88f1-ac900e3d1b1f-kube-api-access-2vxzg" (OuterVolumeSpecName: "kube-api-access-2vxzg") pod "b5eefb80-777b-4277-88f1-ac900e3d1b1f" (UID: "b5eefb80-777b-4277-88f1-ac900e3d1b1f"). InnerVolumeSpecName "kube-api-access-2vxzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.199987 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "b5eefb80-777b-4277-88f1-ac900e3d1b1f" (UID: "b5eefb80-777b-4277-88f1-ac900e3d1b1f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.217463 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-inventory" (OuterVolumeSpecName: "inventory") pod "b5eefb80-777b-4277-88f1-ac900e3d1b1f" (UID: "b5eefb80-777b-4277-88f1-ac900e3d1b1f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.221309 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "b5eefb80-777b-4277-88f1-ac900e3d1b1f" (UID: "b5eefb80-777b-4277-88f1-ac900e3d1b1f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.229359 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b5eefb80-777b-4277-88f1-ac900e3d1b1f" (UID: "b5eefb80-777b-4277-88f1-ac900e3d1b1f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.269817 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.269852 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vxzg\" (UniqueName: \"kubernetes.io/projected/b5eefb80-777b-4277-88f1-ac900e3d1b1f-kube-api-access-2vxzg\") on node \"crc\" DevicePath \"\"" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.269864 4788 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.269872 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.269883 4788 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.269893 4788 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5eefb80-777b-4277-88f1-ac900e3d1b1f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:17:49 crc kubenswrapper[4788]: E0219 09:17:49.539445 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db811eb_116f_4653_94a4_467209ef8e49.slice/crio-ba4992f7161db4a2a459d334ec841f29b1d95095b7971752559f5912def5448c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0db811eb_116f_4653_94a4_467209ef8e49.slice\": RecentStats: unable to find data in memory cache]" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.552607 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" event={"ID":"b5eefb80-777b-4277-88f1-ac900e3d1b1f","Type":"ContainerDied","Data":"70482b143bfe0b9242c6c2d419f8084332a101f5bad7a0f1bcd488cba504cd26"} Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.552655 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70482b143bfe0b9242c6c2d419f8084332a101f5bad7a0f1bcd488cba504cd26" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.552706 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.657207 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp"] Feb 19 09:17:49 crc kubenswrapper[4788]: E0219 09:17:49.658074 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5eefb80-777b-4277-88f1-ac900e3d1b1f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.658192 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5eefb80-777b-4277-88f1-ac900e3d1b1f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.658604 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5eefb80-777b-4277-88f1-ac900e3d1b1f" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.659798 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.662202 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.662418 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.662560 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.662907 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.663287 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.668490 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp"] Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.787937 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.787974 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: 
\"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.787993 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.788080 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nbpm\" (UniqueName: \"kubernetes.io/projected/b6e56cc4-a944-431d-988d-a29bb84b7d04-kube-api-access-8nbpm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.788104 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.890561 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nbpm\" (UniqueName: \"kubernetes.io/projected/b6e56cc4-a944-431d-988d-a29bb84b7d04-kube-api-access-8nbpm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.890631 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.890842 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.890867 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.890889 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.895140 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.895414 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.895991 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.896021 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.909486 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nbpm\" (UniqueName: \"kubernetes.io/projected/b6e56cc4-a944-431d-988d-a29bb84b7d04-kube-api-access-8nbpm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-52lpp\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:49 crc kubenswrapper[4788]: I0219 09:17:49.980638 4788 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:17:50 crc kubenswrapper[4788]: I0219 09:17:50.520081 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp"] Feb 19 09:17:50 crc kubenswrapper[4788]: W0219 09:17:50.522927 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6e56cc4_a944_431d_988d_a29bb84b7d04.slice/crio-4107c029f12909ddfb80ca4b843828261b308fd2ad33af02bf315a13667082c2 WatchSource:0}: Error finding container 4107c029f12909ddfb80ca4b843828261b308fd2ad33af02bf315a13667082c2: Status 404 returned error can't find the container with id 4107c029f12909ddfb80ca4b843828261b308fd2ad33af02bf315a13667082c2 Feb 19 09:17:50 crc kubenswrapper[4788]: I0219 09:17:50.566529 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" event={"ID":"b6e56cc4-a944-431d-988d-a29bb84b7d04","Type":"ContainerStarted","Data":"4107c029f12909ddfb80ca4b843828261b308fd2ad33af02bf315a13667082c2"} Feb 19 09:17:51 crc kubenswrapper[4788]: I0219 09:17:51.577701 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" event={"ID":"b6e56cc4-a944-431d-988d-a29bb84b7d04","Type":"ContainerStarted","Data":"659d2c49fdd0c1e5a552ded56a882dbc13b8ea3b202ce06c8b73aec74ad68aa9"} Feb 19 09:17:51 crc kubenswrapper[4788]: I0219 09:17:51.604653 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" podStartSLOduration=2.170187207 podStartE2EDuration="2.604631498s" podCreationTimestamp="2026-02-19 09:17:49 +0000 UTC" firstStartedPulling="2026-02-19 09:17:50.526294896 +0000 UTC m=+1972.514306368" lastFinishedPulling="2026-02-19 09:17:50.960739167 +0000 UTC m=+1972.948750659" 
observedRunningTime="2026-02-19 09:17:51.594762142 +0000 UTC m=+1973.582773634" watchObservedRunningTime="2026-02-19 09:17:51.604631498 +0000 UTC m=+1973.592642970" Feb 19 09:17:52 crc kubenswrapper[4788]: I0219 09:17:52.140173 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:17:52 crc kubenswrapper[4788]: I0219 09:17:52.140231 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:18:22 crc kubenswrapper[4788]: I0219 09:18:22.140052 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:18:22 crc kubenswrapper[4788]: I0219 09:18:22.141211 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:18:52 crc kubenswrapper[4788]: I0219 09:18:52.138954 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 19 09:18:52 crc kubenswrapper[4788]: I0219 09:18:52.139547 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:18:52 crc kubenswrapper[4788]: I0219 09:18:52.139603 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 09:18:52 crc kubenswrapper[4788]: I0219 09:18:52.140474 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2c63370c54186978ab16c8be5655d7593a49e64407ae314b67deb51ba19c912"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 09:18:52 crc kubenswrapper[4788]: I0219 09:18:52.140528 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://d2c63370c54186978ab16c8be5655d7593a49e64407ae314b67deb51ba19c912" gracePeriod=600 Feb 19 09:18:53 crc kubenswrapper[4788]: I0219 09:18:53.238785 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="d2c63370c54186978ab16c8be5655d7593a49e64407ae314b67deb51ba19c912" exitCode=0 Feb 19 09:18:53 crc kubenswrapper[4788]: I0219 09:18:53.238856 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" 
event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"d2c63370c54186978ab16c8be5655d7593a49e64407ae314b67deb51ba19c912"} Feb 19 09:18:53 crc kubenswrapper[4788]: I0219 09:18:53.239472 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91"} Feb 19 09:18:53 crc kubenswrapper[4788]: I0219 09:18:53.239493 4788 scope.go:117] "RemoveContainer" containerID="845d8228fc09f6ca29ad70e9d75c3716494e502ffcba19f5521f72520e91eb81" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.122062 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zpgjs"] Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.125144 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.142817 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zpgjs"] Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.193788 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv469\" (UniqueName: \"kubernetes.io/projected/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-kube-api-access-bv469\") pod \"community-operators-zpgjs\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.193941 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-utilities\") pod \"community-operators-zpgjs\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " 
pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.193982 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-catalog-content\") pod \"community-operators-zpgjs\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.294921 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv469\" (UniqueName: \"kubernetes.io/projected/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-kube-api-access-bv469\") pod \"community-operators-zpgjs\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.295028 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-utilities\") pod \"community-operators-zpgjs\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.295057 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-catalog-content\") pod \"community-operators-zpgjs\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.295626 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-utilities\") pod \"community-operators-zpgjs\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " 
pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.295640 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-catalog-content\") pod \"community-operators-zpgjs\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.321132 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv469\" (UniqueName: \"kubernetes.io/projected/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-kube-api-access-bv469\") pod \"community-operators-zpgjs\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.445950 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:47 crc kubenswrapper[4788]: I0219 09:20:47.998004 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zpgjs"] Feb 19 09:20:48 crc kubenswrapper[4788]: I0219 09:20:48.338082 4788 generic.go:334] "Generic (PLEG): container finished" podID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" containerID="3f22fea2f169105bb3803a41b1cb0b76617246660eef10641aeb7836140f7ba1" exitCode=0 Feb 19 09:20:48 crc kubenswrapper[4788]: I0219 09:20:48.338126 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zpgjs" event={"ID":"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79","Type":"ContainerDied","Data":"3f22fea2f169105bb3803a41b1cb0b76617246660eef10641aeb7836140f7ba1"} Feb 19 09:20:48 crc kubenswrapper[4788]: I0219 09:20:48.338153 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zpgjs" 
event={"ID":"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79","Type":"ContainerStarted","Data":"40dbb95de347d3f59bc802ae79416bf8b2061cc8715a35033786e5d36ed21d35"} Feb 19 09:20:52 crc kubenswrapper[4788]: I0219 09:20:52.139818 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:20:52 crc kubenswrapper[4788]: I0219 09:20:52.140659 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:20:52 crc kubenswrapper[4788]: I0219 09:20:52.377547 4788 generic.go:334] "Generic (PLEG): container finished" podID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" containerID="777b2bef49033441f0a08d99c7050438710e7f4736e6c0ba3fee219780dbb97d" exitCode=0 Feb 19 09:20:52 crc kubenswrapper[4788]: I0219 09:20:52.377596 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zpgjs" event={"ID":"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79","Type":"ContainerDied","Data":"777b2bef49033441f0a08d99c7050438710e7f4736e6c0ba3fee219780dbb97d"} Feb 19 09:20:53 crc kubenswrapper[4788]: I0219 09:20:53.388431 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zpgjs" event={"ID":"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79","Type":"ContainerStarted","Data":"24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66"} Feb 19 09:20:53 crc kubenswrapper[4788]: I0219 09:20:53.408649 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zpgjs" 
podStartSLOduration=1.987486589 podStartE2EDuration="6.408625083s" podCreationTimestamp="2026-02-19 09:20:47 +0000 UTC" firstStartedPulling="2026-02-19 09:20:48.339824017 +0000 UTC m=+2150.327835499" lastFinishedPulling="2026-02-19 09:20:52.760962521 +0000 UTC m=+2154.748973993" observedRunningTime="2026-02-19 09:20:53.402725263 +0000 UTC m=+2155.390736735" watchObservedRunningTime="2026-02-19 09:20:53.408625083 +0000 UTC m=+2155.396636555" Feb 19 09:20:57 crc kubenswrapper[4788]: I0219 09:20:57.446898 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:57 crc kubenswrapper[4788]: I0219 09:20:57.447423 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:57 crc kubenswrapper[4788]: I0219 09:20:57.496198 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:58 crc kubenswrapper[4788]: I0219 09:20:58.474029 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:20:58 crc kubenswrapper[4788]: I0219 09:20:58.523343 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zpgjs"] Feb 19 09:21:00 crc kubenswrapper[4788]: I0219 09:21:00.449034 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zpgjs" podUID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" containerName="registry-server" containerID="cri-o://24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66" gracePeriod=2 Feb 19 09:21:00 crc kubenswrapper[4788]: I0219 09:21:00.977906 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.092728 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-utilities\") pod \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.093178 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv469\" (UniqueName: \"kubernetes.io/projected/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-kube-api-access-bv469\") pod \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.093372 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-catalog-content\") pod \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\" (UID: \"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79\") " Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.093948 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-utilities" (OuterVolumeSpecName: "utilities") pod "c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" (UID: "c39eeced-7b59-4cef-9ec6-8fc48fc3ae79"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.094128 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.098033 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-kube-api-access-bv469" (OuterVolumeSpecName: "kube-api-access-bv469") pod "c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" (UID: "c39eeced-7b59-4cef-9ec6-8fc48fc3ae79"). InnerVolumeSpecName "kube-api-access-bv469". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.148222 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" (UID: "c39eeced-7b59-4cef-9ec6-8fc48fc3ae79"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.196451 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.196495 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv469\" (UniqueName: \"kubernetes.io/projected/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79-kube-api-access-bv469\") on node \"crc\" DevicePath \"\"" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.466472 4788 generic.go:334] "Generic (PLEG): container finished" podID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" containerID="24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66" exitCode=0 Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.466599 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zpgjs" event={"ID":"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79","Type":"ContainerDied","Data":"24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66"} Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.466614 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zpgjs" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.468004 4788 scope.go:117] "RemoveContainer" containerID="24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.468098 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zpgjs" event={"ID":"c39eeced-7b59-4cef-9ec6-8fc48fc3ae79","Type":"ContainerDied","Data":"40dbb95de347d3f59bc802ae79416bf8b2061cc8715a35033786e5d36ed21d35"} Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.516085 4788 scope.go:117] "RemoveContainer" containerID="777b2bef49033441f0a08d99c7050438710e7f4736e6c0ba3fee219780dbb97d" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.540381 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zpgjs"] Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.560905 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zpgjs"] Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.564941 4788 scope.go:117] "RemoveContainer" containerID="3f22fea2f169105bb3803a41b1cb0b76617246660eef10641aeb7836140f7ba1" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.626838 4788 scope.go:117] "RemoveContainer" containerID="24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66" Feb 19 09:21:01 crc kubenswrapper[4788]: E0219 09:21:01.627557 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66\": container with ID starting with 24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66 not found: ID does not exist" containerID="24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.627615 4788 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66"} err="failed to get container status \"24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66\": rpc error: code = NotFound desc = could not find container \"24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66\": container with ID starting with 24d12d6c6f20e49ef4ae20c910364c63d07ba237d374398770feff6fac7e3b66 not found: ID does not exist" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.627655 4788 scope.go:117] "RemoveContainer" containerID="777b2bef49033441f0a08d99c7050438710e7f4736e6c0ba3fee219780dbb97d" Feb 19 09:21:01 crc kubenswrapper[4788]: E0219 09:21:01.628318 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777b2bef49033441f0a08d99c7050438710e7f4736e6c0ba3fee219780dbb97d\": container with ID starting with 777b2bef49033441f0a08d99c7050438710e7f4736e6c0ba3fee219780dbb97d not found: ID does not exist" containerID="777b2bef49033441f0a08d99c7050438710e7f4736e6c0ba3fee219780dbb97d" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.628364 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777b2bef49033441f0a08d99c7050438710e7f4736e6c0ba3fee219780dbb97d"} err="failed to get container status \"777b2bef49033441f0a08d99c7050438710e7f4736e6c0ba3fee219780dbb97d\": rpc error: code = NotFound desc = could not find container \"777b2bef49033441f0a08d99c7050438710e7f4736e6c0ba3fee219780dbb97d\": container with ID starting with 777b2bef49033441f0a08d99c7050438710e7f4736e6c0ba3fee219780dbb97d not found: ID does not exist" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.628391 4788 scope.go:117] "RemoveContainer" containerID="3f22fea2f169105bb3803a41b1cb0b76617246660eef10641aeb7836140f7ba1" Feb 19 09:21:01 crc kubenswrapper[4788]: E0219 
09:21:01.628865 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f22fea2f169105bb3803a41b1cb0b76617246660eef10641aeb7836140f7ba1\": container with ID starting with 3f22fea2f169105bb3803a41b1cb0b76617246660eef10641aeb7836140f7ba1 not found: ID does not exist" containerID="3f22fea2f169105bb3803a41b1cb0b76617246660eef10641aeb7836140f7ba1" Feb 19 09:21:01 crc kubenswrapper[4788]: I0219 09:21:01.628949 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f22fea2f169105bb3803a41b1cb0b76617246660eef10641aeb7836140f7ba1"} err="failed to get container status \"3f22fea2f169105bb3803a41b1cb0b76617246660eef10641aeb7836140f7ba1\": rpc error: code = NotFound desc = could not find container \"3f22fea2f169105bb3803a41b1cb0b76617246660eef10641aeb7836140f7ba1\": container with ID starting with 3f22fea2f169105bb3803a41b1cb0b76617246660eef10641aeb7836140f7ba1 not found: ID does not exist" Feb 19 09:21:02 crc kubenswrapper[4788]: I0219 09:21:02.733681 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" path="/var/lib/kubelet/pods/c39eeced-7b59-4cef-9ec6-8fc48fc3ae79/volumes" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.138823 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.139413 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.517650 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fnr92"] Feb 19 09:21:22 crc kubenswrapper[4788]: E0219 09:21:22.518113 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" containerName="extract-content" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.518134 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" containerName="extract-content" Feb 19 09:21:22 crc kubenswrapper[4788]: E0219 09:21:22.518155 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" containerName="extract-utilities" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.518165 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" containerName="extract-utilities" Feb 19 09:21:22 crc kubenswrapper[4788]: E0219 09:21:22.518188 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" containerName="registry-server" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.518197 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" containerName="registry-server" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.518494 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39eeced-7b59-4cef-9ec6-8fc48fc3ae79" containerName="registry-server" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.520073 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.543011 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fnr92"] Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.710387 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6b7p\" (UniqueName: \"kubernetes.io/projected/915abe52-d3da-4b57-ac55-20b48462941f-kube-api-access-g6b7p\") pod \"certified-operators-fnr92\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.710473 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-utilities\") pod \"certified-operators-fnr92\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.711960 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-catalog-content\") pod \"certified-operators-fnr92\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.817035 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6b7p\" (UniqueName: \"kubernetes.io/projected/915abe52-d3da-4b57-ac55-20b48462941f-kube-api-access-g6b7p\") pod \"certified-operators-fnr92\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.817151 4788 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-utilities\") pod \"certified-operators-fnr92\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.817288 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-catalog-content\") pod \"certified-operators-fnr92\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.817971 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-catalog-content\") pod \"certified-operators-fnr92\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.817971 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-utilities\") pod \"certified-operators-fnr92\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.846845 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6b7p\" (UniqueName: \"kubernetes.io/projected/915abe52-d3da-4b57-ac55-20b48462941f-kube-api-access-g6b7p\") pod \"certified-operators-fnr92\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:22 crc kubenswrapper[4788]: I0219 09:21:22.856450 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:23 crc kubenswrapper[4788]: I0219 09:21:23.367587 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fnr92"] Feb 19 09:21:23 crc kubenswrapper[4788]: I0219 09:21:23.669899 4788 generic.go:334] "Generic (PLEG): container finished" podID="915abe52-d3da-4b57-ac55-20b48462941f" containerID="e16b2a7fc2a0fbaf25e7ec29a4a02bb49ed24fc6680dcb4f4cce03358af16a07" exitCode=0 Feb 19 09:21:23 crc kubenswrapper[4788]: I0219 09:21:23.669974 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr92" event={"ID":"915abe52-d3da-4b57-ac55-20b48462941f","Type":"ContainerDied","Data":"e16b2a7fc2a0fbaf25e7ec29a4a02bb49ed24fc6680dcb4f4cce03358af16a07"} Feb 19 09:21:23 crc kubenswrapper[4788]: I0219 09:21:23.670131 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr92" event={"ID":"915abe52-d3da-4b57-ac55-20b48462941f","Type":"ContainerStarted","Data":"ffb281ede8d0f8c0fae18d4fcfbdf88d2849166a94a4dfd82470cf45094fe4d6"} Feb 19 09:21:23 crc kubenswrapper[4788]: I0219 09:21:23.672356 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:21:24 crc kubenswrapper[4788]: I0219 09:21:24.680621 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr92" event={"ID":"915abe52-d3da-4b57-ac55-20b48462941f","Type":"ContainerStarted","Data":"9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca"} Feb 19 09:21:25 crc kubenswrapper[4788]: I0219 09:21:25.689631 4788 generic.go:334] "Generic (PLEG): container finished" podID="915abe52-d3da-4b57-ac55-20b48462941f" containerID="9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca" exitCode=0 Feb 19 09:21:25 crc kubenswrapper[4788]: I0219 09:21:25.689725 4788 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-fnr92" event={"ID":"915abe52-d3da-4b57-ac55-20b48462941f","Type":"ContainerDied","Data":"9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca"} Feb 19 09:21:26 crc kubenswrapper[4788]: I0219 09:21:26.705044 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr92" event={"ID":"915abe52-d3da-4b57-ac55-20b48462941f","Type":"ContainerStarted","Data":"084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb"} Feb 19 09:21:29 crc kubenswrapper[4788]: I0219 09:21:29.731819 4788 generic.go:334] "Generic (PLEG): container finished" podID="b6e56cc4-a944-431d-988d-a29bb84b7d04" containerID="659d2c49fdd0c1e5a552ded56a882dbc13b8ea3b202ce06c8b73aec74ad68aa9" exitCode=0 Feb 19 09:21:29 crc kubenswrapper[4788]: I0219 09:21:29.731904 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" event={"ID":"b6e56cc4-a944-431d-988d-a29bb84b7d04","Type":"ContainerDied","Data":"659d2c49fdd0c1e5a552ded56a882dbc13b8ea3b202ce06c8b73aec74ad68aa9"} Feb 19 09:21:29 crc kubenswrapper[4788]: I0219 09:21:29.754863 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fnr92" podStartSLOduration=5.34485699 podStartE2EDuration="7.754845728s" podCreationTimestamp="2026-02-19 09:21:22 +0000 UTC" firstStartedPulling="2026-02-19 09:21:23.672043791 +0000 UTC m=+2185.660055263" lastFinishedPulling="2026-02-19 09:21:26.082032529 +0000 UTC m=+2188.070044001" observedRunningTime="2026-02-19 09:21:26.722641779 +0000 UTC m=+2188.710653261" watchObservedRunningTime="2026-02-19 09:21:29.754845728 +0000 UTC m=+2191.742857200" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.217161 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.308894 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-inventory\") pod \"b6e56cc4-a944-431d-988d-a29bb84b7d04\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.309141 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-secret-0\") pod \"b6e56cc4-a944-431d-988d-a29bb84b7d04\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.309196 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-ssh-key-openstack-edpm-ipam\") pod \"b6e56cc4-a944-431d-988d-a29bb84b7d04\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.309234 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-combined-ca-bundle\") pod \"b6e56cc4-a944-431d-988d-a29bb84b7d04\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.309305 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nbpm\" (UniqueName: \"kubernetes.io/projected/b6e56cc4-a944-431d-988d-a29bb84b7d04-kube-api-access-8nbpm\") pod \"b6e56cc4-a944-431d-988d-a29bb84b7d04\" (UID: \"b6e56cc4-a944-431d-988d-a29bb84b7d04\") " Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.316350 4788 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e56cc4-a944-431d-988d-a29bb84b7d04-kube-api-access-8nbpm" (OuterVolumeSpecName: "kube-api-access-8nbpm") pod "b6e56cc4-a944-431d-988d-a29bb84b7d04" (UID: "b6e56cc4-a944-431d-988d-a29bb84b7d04"). InnerVolumeSpecName "kube-api-access-8nbpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.316391 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b6e56cc4-a944-431d-988d-a29bb84b7d04" (UID: "b6e56cc4-a944-431d-988d-a29bb84b7d04"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.340686 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b6e56cc4-a944-431d-988d-a29bb84b7d04" (UID: "b6e56cc4-a944-431d-988d-a29bb84b7d04"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.340825 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b6e56cc4-a944-431d-988d-a29bb84b7d04" (UID: "b6e56cc4-a944-431d-988d-a29bb84b7d04"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.342581 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-inventory" (OuterVolumeSpecName: "inventory") pod "b6e56cc4-a944-431d-988d-a29bb84b7d04" (UID: "b6e56cc4-a944-431d-988d-a29bb84b7d04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.410515 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nbpm\" (UniqueName: \"kubernetes.io/projected/b6e56cc4-a944-431d-988d-a29bb84b7d04-kube-api-access-8nbpm\") on node \"crc\" DevicePath \"\"" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.410549 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.410559 4788 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.410569 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.410577 4788 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e56cc4-a944-431d-988d-a29bb84b7d04-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.749990 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" event={"ID":"b6e56cc4-a944-431d-988d-a29bb84b7d04","Type":"ContainerDied","Data":"4107c029f12909ddfb80ca4b843828261b308fd2ad33af02bf315a13667082c2"} Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.750029 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4107c029f12909ddfb80ca4b843828261b308fd2ad33af02bf315a13667082c2" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.750071 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-52lpp" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.844045 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4"] Feb 19 09:21:31 crc kubenswrapper[4788]: E0219 09:21:31.844707 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e56cc4-a944-431d-988d-a29bb84b7d04" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.844729 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e56cc4-a944-431d-988d-a29bb84b7d04" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.844939 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e56cc4-a944-431d-988d-a29bb84b7d04" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.846704 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.849759 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.849814 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.851186 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.851485 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.852275 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.852453 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.857365 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4"] Feb 19 09:21:31 crc kubenswrapper[4788]: I0219 09:21:31.863014 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.021597 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.021678 4788 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.022010 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.022085 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9blnz\" (UniqueName: \"kubernetes.io/projected/015b4a74-0341-4e84-862c-d627e79f1318-kube-api-access-9blnz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.022114 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.022181 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.022325 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.022435 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.022496 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.022535 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/015b4a74-0341-4e84-862c-d627e79f1318-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: 
\"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.022585 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.124066 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.124115 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.124140 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/015b4a74-0341-4e84-862c-d627e79f1318-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.124164 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.124189 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.124237 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.124361 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.124397 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9blnz\" (UniqueName: \"kubernetes.io/projected/015b4a74-0341-4e84-862c-d627e79f1318-kube-api-access-9blnz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: 
\"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.124419 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.124456 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.124897 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.125802 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/015b4a74-0341-4e84-862c-d627e79f1318-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.129301 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.129901 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.129993 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.130283 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.130506 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.131775 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.132583 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.132711 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.136757 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.147581 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9blnz\" (UniqueName: 
\"kubernetes.io/projected/015b4a74-0341-4e84-862c-d627e79f1318-kube-api-access-9blnz\") pod \"nova-edpm-deployment-openstack-edpm-ipam-l5jz4\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.175330 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.728369 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4"] Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.761714 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" event={"ID":"015b4a74-0341-4e84-862c-d627e79f1318","Type":"ContainerStarted","Data":"b39f9cf86605e5050d9edd274c4d061e5a4275559e3c59639dd6d020dc68719b"} Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.857577 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.857661 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:32 crc kubenswrapper[4788]: I0219 09:21:32.904389 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:33 crc kubenswrapper[4788]: I0219 09:21:33.824809 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:33 crc kubenswrapper[4788]: I0219 09:21:33.897068 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fnr92"] Feb 19 09:21:34 crc kubenswrapper[4788]: I0219 09:21:34.786911 4788 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" event={"ID":"015b4a74-0341-4e84-862c-d627e79f1318","Type":"ContainerStarted","Data":"af38f66766abe4723ba3de3d20cb774f00e35bb0097262d58f23bfbc0cc8692d"} Feb 19 09:21:34 crc kubenswrapper[4788]: I0219 09:21:34.810229 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" podStartSLOduration=2.698959989 podStartE2EDuration="3.810206388s" podCreationTimestamp="2026-02-19 09:21:31 +0000 UTC" firstStartedPulling="2026-02-19 09:21:32.727582414 +0000 UTC m=+2194.715593886" lastFinishedPulling="2026-02-19 09:21:33.838828813 +0000 UTC m=+2195.826840285" observedRunningTime="2026-02-19 09:21:34.807827374 +0000 UTC m=+2196.795838876" watchObservedRunningTime="2026-02-19 09:21:34.810206388 +0000 UTC m=+2196.798217870" Feb 19 09:21:35 crc kubenswrapper[4788]: I0219 09:21:35.793613 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fnr92" podUID="915abe52-d3da-4b57-ac55-20b48462941f" containerName="registry-server" containerID="cri-o://084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb" gracePeriod=2 Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.588158 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.641901 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-catalog-content\") pod \"915abe52-d3da-4b57-ac55-20b48462941f\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.642091 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-utilities\") pod \"915abe52-d3da-4b57-ac55-20b48462941f\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.642122 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6b7p\" (UniqueName: \"kubernetes.io/projected/915abe52-d3da-4b57-ac55-20b48462941f-kube-api-access-g6b7p\") pod \"915abe52-d3da-4b57-ac55-20b48462941f\" (UID: \"915abe52-d3da-4b57-ac55-20b48462941f\") " Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.650765 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-utilities" (OuterVolumeSpecName: "utilities") pod "915abe52-d3da-4b57-ac55-20b48462941f" (UID: "915abe52-d3da-4b57-ac55-20b48462941f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.673814 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/915abe52-d3da-4b57-ac55-20b48462941f-kube-api-access-g6b7p" (OuterVolumeSpecName: "kube-api-access-g6b7p") pod "915abe52-d3da-4b57-ac55-20b48462941f" (UID: "915abe52-d3da-4b57-ac55-20b48462941f"). InnerVolumeSpecName "kube-api-access-g6b7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.691982 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "915abe52-d3da-4b57-ac55-20b48462941f" (UID: "915abe52-d3da-4b57-ac55-20b48462941f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.744802 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.744851 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6b7p\" (UniqueName: \"kubernetes.io/projected/915abe52-d3da-4b57-ac55-20b48462941f-kube-api-access-g6b7p\") on node \"crc\" DevicePath \"\"" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.744862 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/915abe52-d3da-4b57-ac55-20b48462941f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.803457 4788 generic.go:334] "Generic (PLEG): container finished" podID="915abe52-d3da-4b57-ac55-20b48462941f" containerID="084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb" exitCode=0 Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.803506 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr92" event={"ID":"915abe52-d3da-4b57-ac55-20b48462941f","Type":"ContainerDied","Data":"084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb"} Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.803529 4788 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnr92" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.803541 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnr92" event={"ID":"915abe52-d3da-4b57-ac55-20b48462941f","Type":"ContainerDied","Data":"ffb281ede8d0f8c0fae18d4fcfbdf88d2849166a94a4dfd82470cf45094fe4d6"} Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.803578 4788 scope.go:117] "RemoveContainer" containerID="084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.835689 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fnr92"] Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.843912 4788 scope.go:117] "RemoveContainer" containerID="9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.844255 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fnr92"] Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.865197 4788 scope.go:117] "RemoveContainer" containerID="e16b2a7fc2a0fbaf25e7ec29a4a02bb49ed24fc6680dcb4f4cce03358af16a07" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.907891 4788 scope.go:117] "RemoveContainer" containerID="084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb" Feb 19 09:21:36 crc kubenswrapper[4788]: E0219 09:21:36.908434 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb\": container with ID starting with 084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb not found: ID does not exist" containerID="084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.908479 
4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb"} err="failed to get container status \"084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb\": rpc error: code = NotFound desc = could not find container \"084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb\": container with ID starting with 084a40d2ee2e3cf5d4d43f06b652b3b2c2fea7d5c25fb917ab7dc0b6b90ccdcb not found: ID does not exist" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.908507 4788 scope.go:117] "RemoveContainer" containerID="9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca" Feb 19 09:21:36 crc kubenswrapper[4788]: E0219 09:21:36.908813 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca\": container with ID starting with 9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca not found: ID does not exist" containerID="9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.908851 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca"} err="failed to get container status \"9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca\": rpc error: code = NotFound desc = could not find container \"9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca\": container with ID starting with 9ed88afff90a936d2b3f4261461e979918034fd992fc21e32714a3b1052627ca not found: ID does not exist" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.908876 4788 scope.go:117] "RemoveContainer" containerID="e16b2a7fc2a0fbaf25e7ec29a4a02bb49ed24fc6680dcb4f4cce03358af16a07" Feb 19 09:21:36 crc kubenswrapper[4788]: E0219 
09:21:36.909225 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e16b2a7fc2a0fbaf25e7ec29a4a02bb49ed24fc6680dcb4f4cce03358af16a07\": container with ID starting with e16b2a7fc2a0fbaf25e7ec29a4a02bb49ed24fc6680dcb4f4cce03358af16a07 not found: ID does not exist" containerID="e16b2a7fc2a0fbaf25e7ec29a4a02bb49ed24fc6680dcb4f4cce03358af16a07" Feb 19 09:21:36 crc kubenswrapper[4788]: I0219 09:21:36.909284 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e16b2a7fc2a0fbaf25e7ec29a4a02bb49ed24fc6680dcb4f4cce03358af16a07"} err="failed to get container status \"e16b2a7fc2a0fbaf25e7ec29a4a02bb49ed24fc6680dcb4f4cce03358af16a07\": rpc error: code = NotFound desc = could not find container \"e16b2a7fc2a0fbaf25e7ec29a4a02bb49ed24fc6680dcb4f4cce03358af16a07\": container with ID starting with e16b2a7fc2a0fbaf25e7ec29a4a02bb49ed24fc6680dcb4f4cce03358af16a07 not found: ID does not exist" Feb 19 09:21:38 crc kubenswrapper[4788]: I0219 09:21:38.747864 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="915abe52-d3da-4b57-ac55-20b48462941f" path="/var/lib/kubelet/pods/915abe52-d3da-4b57-ac55-20b48462941f/volumes" Feb 19 09:21:52 crc kubenswrapper[4788]: I0219 09:21:52.139503 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:21:52 crc kubenswrapper[4788]: I0219 09:21:52.140216 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 19 09:21:52 crc kubenswrapper[4788]: I0219 09:21:52.140313 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 09:21:52 crc kubenswrapper[4788]: I0219 09:21:52.141962 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 09:21:52 crc kubenswrapper[4788]: I0219 09:21:52.142094 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" gracePeriod=600 Feb 19 09:21:52 crc kubenswrapper[4788]: E0219 09:21:52.778066 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:21:52 crc kubenswrapper[4788]: I0219 09:21:52.955769 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" exitCode=0 Feb 19 09:21:52 crc kubenswrapper[4788]: I0219 09:21:52.955835 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" 
event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91"} Feb 19 09:21:52 crc kubenswrapper[4788]: I0219 09:21:52.955906 4788 scope.go:117] "RemoveContainer" containerID="d2c63370c54186978ab16c8be5655d7593a49e64407ae314b67deb51ba19c912" Feb 19 09:21:52 crc kubenswrapper[4788]: I0219 09:21:52.956640 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:21:52 crc kubenswrapper[4788]: E0219 09:21:52.956950 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:22:04 crc kubenswrapper[4788]: I0219 09:22:04.714280 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:22:04 crc kubenswrapper[4788]: E0219 09:22:04.715285 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:22:18 crc kubenswrapper[4788]: I0219 09:22:18.721929 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:22:18 crc kubenswrapper[4788]: E0219 09:22:18.722951 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:22:29 crc kubenswrapper[4788]: I0219 09:22:29.715356 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:22:29 crc kubenswrapper[4788]: E0219 09:22:29.716599 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:22:41 crc kubenswrapper[4788]: I0219 09:22:41.714785 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:22:41 crc kubenswrapper[4788]: E0219 09:22:41.715699 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:22:53 crc kubenswrapper[4788]: I0219 09:22:53.715024 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:22:53 crc kubenswrapper[4788]: E0219 09:22:53.715853 4788 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:23:06 crc kubenswrapper[4788]: I0219 09:23:06.715711 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:23:06 crc kubenswrapper[4788]: E0219 09:23:06.717016 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:23:08 crc kubenswrapper[4788]: I0219 09:23:08.794779 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v6xjp"] Feb 19 09:23:08 crc kubenswrapper[4788]: E0219 09:23:08.796280 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="915abe52-d3da-4b57-ac55-20b48462941f" containerName="extract-utilities" Feb 19 09:23:08 crc kubenswrapper[4788]: I0219 09:23:08.796364 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="915abe52-d3da-4b57-ac55-20b48462941f" containerName="extract-utilities" Feb 19 09:23:08 crc kubenswrapper[4788]: E0219 09:23:08.796424 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="915abe52-d3da-4b57-ac55-20b48462941f" containerName="extract-content" Feb 19 09:23:08 crc kubenswrapper[4788]: I0219 09:23:08.796478 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="915abe52-d3da-4b57-ac55-20b48462941f" 
containerName="extract-content" Feb 19 09:23:08 crc kubenswrapper[4788]: E0219 09:23:08.796552 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="915abe52-d3da-4b57-ac55-20b48462941f" containerName="registry-server" Feb 19 09:23:08 crc kubenswrapper[4788]: I0219 09:23:08.796610 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="915abe52-d3da-4b57-ac55-20b48462941f" containerName="registry-server" Feb 19 09:23:08 crc kubenswrapper[4788]: I0219 09:23:08.796854 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="915abe52-d3da-4b57-ac55-20b48462941f" containerName="registry-server" Feb 19 09:23:08 crc kubenswrapper[4788]: I0219 09:23:08.798109 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:08 crc kubenswrapper[4788]: I0219 09:23:08.816325 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6xjp"] Feb 19 09:23:08 crc kubenswrapper[4788]: I0219 09:23:08.925147 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-catalog-content\") pod \"redhat-marketplace-v6xjp\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:08 crc kubenswrapper[4788]: I0219 09:23:08.925538 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stc8v\" (UniqueName: \"kubernetes.io/projected/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-kube-api-access-stc8v\") pod \"redhat-marketplace-v6xjp\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:08 crc kubenswrapper[4788]: I0219 09:23:08.925578 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-utilities\") pod \"redhat-marketplace-v6xjp\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.004388 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p4mnb"] Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.006864 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.013403 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4mnb"] Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.030724 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-catalog-content\") pod \"redhat-marketplace-v6xjp\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.030897 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stc8v\" (UniqueName: \"kubernetes.io/projected/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-kube-api-access-stc8v\") pod \"redhat-marketplace-v6xjp\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.030959 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-utilities\") pod \"redhat-marketplace-v6xjp\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 
09:23:09.031823 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-catalog-content\") pod \"redhat-marketplace-v6xjp\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.032941 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-utilities\") pod \"redhat-marketplace-v6xjp\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.052769 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stc8v\" (UniqueName: \"kubernetes.io/projected/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-kube-api-access-stc8v\") pod \"redhat-marketplace-v6xjp\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.127870 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.133130 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-utilities\") pod \"redhat-operators-p4mnb\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.133278 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg6xl\" (UniqueName: \"kubernetes.io/projected/24f28bdb-20e7-4942-b441-7f8c01096e58-kube-api-access-vg6xl\") pod \"redhat-operators-p4mnb\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.133366 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-catalog-content\") pod \"redhat-operators-p4mnb\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.234732 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg6xl\" (UniqueName: \"kubernetes.io/projected/24f28bdb-20e7-4942-b441-7f8c01096e58-kube-api-access-vg6xl\") pod \"redhat-operators-p4mnb\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.234845 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-catalog-content\") pod 
\"redhat-operators-p4mnb\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.234899 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-utilities\") pod \"redhat-operators-p4mnb\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.235475 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-utilities\") pod \"redhat-operators-p4mnb\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.235479 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-catalog-content\") pod \"redhat-operators-p4mnb\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.258860 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg6xl\" (UniqueName: \"kubernetes.io/projected/24f28bdb-20e7-4942-b441-7f8c01096e58-kube-api-access-vg6xl\") pod \"redhat-operators-p4mnb\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.332684 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.684596 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6xjp"] Feb 19 09:23:09 crc kubenswrapper[4788]: W0219 09:23:09.685539 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd95bbd01_ebd1_46b6_9b09_7b60f18980b1.slice/crio-4220e7d4d09250cfe51d3efa975060961519b150f8212f6df4de9a698a21dfd2 WatchSource:0}: Error finding container 4220e7d4d09250cfe51d3efa975060961519b150f8212f6df4de9a698a21dfd2: Status 404 returned error can't find the container with id 4220e7d4d09250cfe51d3efa975060961519b150f8212f6df4de9a698a21dfd2 Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.919862 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6xjp" event={"ID":"d95bbd01-ebd1-46b6-9b09-7b60f18980b1","Type":"ContainerStarted","Data":"4220e7d4d09250cfe51d3efa975060961519b150f8212f6df4de9a698a21dfd2"} Feb 19 09:23:09 crc kubenswrapper[4788]: I0219 09:23:09.958102 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4mnb"] Feb 19 09:23:09 crc kubenswrapper[4788]: W0219 09:23:09.959900 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24f28bdb_20e7_4942_b441_7f8c01096e58.slice/crio-a6d13b03946a34748f99c4fbb112545eca804c4051fcede082438446944e4991 WatchSource:0}: Error finding container a6d13b03946a34748f99c4fbb112545eca804c4051fcede082438446944e4991: Status 404 returned error can't find the container with id a6d13b03946a34748f99c4fbb112545eca804c4051fcede082438446944e4991 Feb 19 09:23:10 crc kubenswrapper[4788]: I0219 09:23:10.928554 4788 generic.go:334] "Generic (PLEG): container finished" podID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" 
containerID="1363ca8ea618b47fd1f671c23dac50a742f2222090dd626698f8f334c7c1ec30" exitCode=0 Feb 19 09:23:10 crc kubenswrapper[4788]: I0219 09:23:10.928631 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6xjp" event={"ID":"d95bbd01-ebd1-46b6-9b09-7b60f18980b1","Type":"ContainerDied","Data":"1363ca8ea618b47fd1f671c23dac50a742f2222090dd626698f8f334c7c1ec30"} Feb 19 09:23:10 crc kubenswrapper[4788]: I0219 09:23:10.935284 4788 generic.go:334] "Generic (PLEG): container finished" podID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerID="9d6298d596155a188370b06054bf755c6b14771c151b9a0eac4a891e7c43a50d" exitCode=0 Feb 19 09:23:10 crc kubenswrapper[4788]: I0219 09:23:10.935338 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mnb" event={"ID":"24f28bdb-20e7-4942-b441-7f8c01096e58","Type":"ContainerDied","Data":"9d6298d596155a188370b06054bf755c6b14771c151b9a0eac4a891e7c43a50d"} Feb 19 09:23:10 crc kubenswrapper[4788]: I0219 09:23:10.935368 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mnb" event={"ID":"24f28bdb-20e7-4942-b441-7f8c01096e58","Type":"ContainerStarted","Data":"a6d13b03946a34748f99c4fbb112545eca804c4051fcede082438446944e4991"} Feb 19 09:23:14 crc kubenswrapper[4788]: I0219 09:23:14.993576 4788 generic.go:334] "Generic (PLEG): container finished" podID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" containerID="7d2c6706e1e083debbe20c50f5f3d90f03fa3ad03c18c64228062d6159983589" exitCode=0 Feb 19 09:23:14 crc kubenswrapper[4788]: I0219 09:23:14.993820 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6xjp" event={"ID":"d95bbd01-ebd1-46b6-9b09-7b60f18980b1","Type":"ContainerDied","Data":"7d2c6706e1e083debbe20c50f5f3d90f03fa3ad03c18c64228062d6159983589"} Feb 19 09:23:14 crc kubenswrapper[4788]: I0219 09:23:14.996743 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-p4mnb" event={"ID":"24f28bdb-20e7-4942-b441-7f8c01096e58","Type":"ContainerStarted","Data":"4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27"} Feb 19 09:23:16 crc kubenswrapper[4788]: I0219 09:23:16.010993 4788 generic.go:334] "Generic (PLEG): container finished" podID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerID="4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27" exitCode=0 Feb 19 09:23:16 crc kubenswrapper[4788]: I0219 09:23:16.011054 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mnb" event={"ID":"24f28bdb-20e7-4942-b441-7f8c01096e58","Type":"ContainerDied","Data":"4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27"} Feb 19 09:23:18 crc kubenswrapper[4788]: I0219 09:23:18.035491 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6xjp" event={"ID":"d95bbd01-ebd1-46b6-9b09-7b60f18980b1","Type":"ContainerStarted","Data":"f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425"} Feb 19 09:23:18 crc kubenswrapper[4788]: I0219 09:23:18.055561 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v6xjp" podStartSLOduration=3.450648533 podStartE2EDuration="10.05554328s" podCreationTimestamp="2026-02-19 09:23:08 +0000 UTC" firstStartedPulling="2026-02-19 09:23:10.932405783 +0000 UTC m=+2292.920417295" lastFinishedPulling="2026-02-19 09:23:17.53730057 +0000 UTC m=+2299.525312042" observedRunningTime="2026-02-19 09:23:18.053517378 +0000 UTC m=+2300.041528860" watchObservedRunningTime="2026-02-19 09:23:18.05554328 +0000 UTC m=+2300.043554762" Feb 19 09:23:18 crc kubenswrapper[4788]: I0219 09:23:18.729608 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:23:18 crc kubenswrapper[4788]: E0219 09:23:18.730508 4788 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:23:19 crc kubenswrapper[4788]: I0219 09:23:19.046896 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mnb" event={"ID":"24f28bdb-20e7-4942-b441-7f8c01096e58","Type":"ContainerStarted","Data":"27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa"} Feb 19 09:23:19 crc kubenswrapper[4788]: I0219 09:23:19.079505 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p4mnb" podStartSLOduration=3.88126467 podStartE2EDuration="11.07947893s" podCreationTimestamp="2026-02-19 09:23:08 +0000 UTC" firstStartedPulling="2026-02-19 09:23:10.937880383 +0000 UTC m=+2292.925891855" lastFinishedPulling="2026-02-19 09:23:18.136094643 +0000 UTC m=+2300.124106115" observedRunningTime="2026-02-19 09:23:19.074004891 +0000 UTC m=+2301.062016373" watchObservedRunningTime="2026-02-19 09:23:19.07947893 +0000 UTC m=+2301.067490422" Feb 19 09:23:19 crc kubenswrapper[4788]: I0219 09:23:19.129023 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:19 crc kubenswrapper[4788]: I0219 09:23:19.129085 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:19 crc kubenswrapper[4788]: I0219 09:23:19.334068 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:19 crc kubenswrapper[4788]: I0219 09:23:19.334123 4788 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:20 crc kubenswrapper[4788]: I0219 09:23:20.188729 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-v6xjp" podUID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" containerName="registry-server" probeResult="failure" output=< Feb 19 09:23:20 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 09:23:20 crc kubenswrapper[4788]: > Feb 19 09:23:20 crc kubenswrapper[4788]: I0219 09:23:20.398477 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p4mnb" podUID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerName="registry-server" probeResult="failure" output=< Feb 19 09:23:20 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 09:23:20 crc kubenswrapper[4788]: > Feb 19 09:23:29 crc kubenswrapper[4788]: I0219 09:23:29.211476 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:29 crc kubenswrapper[4788]: I0219 09:23:29.285316 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:29 crc kubenswrapper[4788]: I0219 09:23:29.399101 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:29 crc kubenswrapper[4788]: I0219 09:23:29.462519 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:29 crc kubenswrapper[4788]: I0219 09:23:29.474715 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6xjp"] Feb 19 09:23:31 crc kubenswrapper[4788]: I0219 09:23:31.170012 4788 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openshift-marketplace/redhat-marketplace-v6xjp" podUID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" containerName="registry-server" containerID="cri-o://f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425" gracePeriod=2 Feb 19 09:23:31 crc kubenswrapper[4788]: I0219 09:23:31.668842 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4mnb"] Feb 19 09:23:31 crc kubenswrapper[4788]: I0219 09:23:31.669152 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p4mnb" podUID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerName="registry-server" containerID="cri-o://27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa" gracePeriod=2 Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.147316 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.156608 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.183258 4788 generic.go:334] "Generic (PLEG): container finished" podID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerID="27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa" exitCode=0 Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.183321 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p4mnb" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.183344 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mnb" event={"ID":"24f28bdb-20e7-4942-b441-7f8c01096e58","Type":"ContainerDied","Data":"27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa"} Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.183392 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4mnb" event={"ID":"24f28bdb-20e7-4942-b441-7f8c01096e58","Type":"ContainerDied","Data":"a6d13b03946a34748f99c4fbb112545eca804c4051fcede082438446944e4991"} Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.183412 4788 scope.go:117] "RemoveContainer" containerID="27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.187296 4788 generic.go:334] "Generic (PLEG): container finished" podID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" containerID="f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425" exitCode=0 Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.187349 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6xjp" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.187357 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6xjp" event={"ID":"d95bbd01-ebd1-46b6-9b09-7b60f18980b1","Type":"ContainerDied","Data":"f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425"} Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.187732 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6xjp" event={"ID":"d95bbd01-ebd1-46b6-9b09-7b60f18980b1","Type":"ContainerDied","Data":"4220e7d4d09250cfe51d3efa975060961519b150f8212f6df4de9a698a21dfd2"} Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.212262 4788 scope.go:117] "RemoveContainer" containerID="4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.233463 4788 scope.go:117] "RemoveContainer" containerID="9d6298d596155a188370b06054bf755c6b14771c151b9a0eac4a891e7c43a50d" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.253052 4788 scope.go:117] "RemoveContainer" containerID="27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa" Feb 19 09:23:32 crc kubenswrapper[4788]: E0219 09:23:32.253749 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa\": container with ID starting with 27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa not found: ID does not exist" containerID="27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.253787 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa"} err="failed to get container status 
\"27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa\": rpc error: code = NotFound desc = could not find container \"27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa\": container with ID starting with 27b5f777364bd73e285cd52aa3d8350cb9528eeaaee0b1ad36cf86fd2d02ebaa not found: ID does not exist" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.253813 4788 scope.go:117] "RemoveContainer" containerID="4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27" Feb 19 09:23:32 crc kubenswrapper[4788]: E0219 09:23:32.254036 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27\": container with ID starting with 4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27 not found: ID does not exist" containerID="4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.254201 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27"} err="failed to get container status \"4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27\": rpc error: code = NotFound desc = could not find container \"4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27\": container with ID starting with 4a3365219215518126e694299470ff5d6151dd51bc4d258336978a2a173bac27 not found: ID does not exist" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.254356 4788 scope.go:117] "RemoveContainer" containerID="9d6298d596155a188370b06054bf755c6b14771c151b9a0eac4a891e7c43a50d" Feb 19 09:23:32 crc kubenswrapper[4788]: E0219 09:23:32.255810 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9d6298d596155a188370b06054bf755c6b14771c151b9a0eac4a891e7c43a50d\": container with ID starting with 9d6298d596155a188370b06054bf755c6b14771c151b9a0eac4a891e7c43a50d not found: ID does not exist" containerID="9d6298d596155a188370b06054bf755c6b14771c151b9a0eac4a891e7c43a50d" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.255954 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6298d596155a188370b06054bf755c6b14771c151b9a0eac4a891e7c43a50d"} err="failed to get container status \"9d6298d596155a188370b06054bf755c6b14771c151b9a0eac4a891e7c43a50d\": rpc error: code = NotFound desc = could not find container \"9d6298d596155a188370b06054bf755c6b14771c151b9a0eac4a891e7c43a50d\": container with ID starting with 9d6298d596155a188370b06054bf755c6b14771c151b9a0eac4a891e7c43a50d not found: ID does not exist" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.256051 4788 scope.go:117] "RemoveContainer" containerID="f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.262870 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stc8v\" (UniqueName: \"kubernetes.io/projected/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-kube-api-access-stc8v\") pod \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.262968 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-catalog-content\") pod \"24f28bdb-20e7-4942-b441-7f8c01096e58\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.263089 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg6xl\" (UniqueName: 
\"kubernetes.io/projected/24f28bdb-20e7-4942-b441-7f8c01096e58-kube-api-access-vg6xl\") pod \"24f28bdb-20e7-4942-b441-7f8c01096e58\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.263160 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-utilities\") pod \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.263183 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-utilities\") pod \"24f28bdb-20e7-4942-b441-7f8c01096e58\" (UID: \"24f28bdb-20e7-4942-b441-7f8c01096e58\") " Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.263236 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-catalog-content\") pod \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\" (UID: \"d95bbd01-ebd1-46b6-9b09-7b60f18980b1\") " Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.264549 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-utilities" (OuterVolumeSpecName: "utilities") pod "24f28bdb-20e7-4942-b441-7f8c01096e58" (UID: "24f28bdb-20e7-4942-b441-7f8c01096e58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.264684 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-utilities" (OuterVolumeSpecName: "utilities") pod "d95bbd01-ebd1-46b6-9b09-7b60f18980b1" (UID: "d95bbd01-ebd1-46b6-9b09-7b60f18980b1"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.269629 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-kube-api-access-stc8v" (OuterVolumeSpecName: "kube-api-access-stc8v") pod "d95bbd01-ebd1-46b6-9b09-7b60f18980b1" (UID: "d95bbd01-ebd1-46b6-9b09-7b60f18980b1"). InnerVolumeSpecName "kube-api-access-stc8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.270331 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f28bdb-20e7-4942-b441-7f8c01096e58-kube-api-access-vg6xl" (OuterVolumeSpecName: "kube-api-access-vg6xl") pod "24f28bdb-20e7-4942-b441-7f8c01096e58" (UID: "24f28bdb-20e7-4942-b441-7f8c01096e58"). InnerVolumeSpecName "kube-api-access-vg6xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.275266 4788 scope.go:117] "RemoveContainer" containerID="7d2c6706e1e083debbe20c50f5f3d90f03fa3ad03c18c64228062d6159983589" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.291903 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d95bbd01-ebd1-46b6-9b09-7b60f18980b1" (UID: "d95bbd01-ebd1-46b6-9b09-7b60f18980b1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.346063 4788 scope.go:117] "RemoveContainer" containerID="1363ca8ea618b47fd1f671c23dac50a742f2222090dd626698f8f334c7c1ec30" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.366439 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.366475 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.366488 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.366521 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stc8v\" (UniqueName: \"kubernetes.io/projected/d95bbd01-ebd1-46b6-9b09-7b60f18980b1-kube-api-access-stc8v\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.366538 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg6xl\" (UniqueName: \"kubernetes.io/projected/24f28bdb-20e7-4942-b441-7f8c01096e58-kube-api-access-vg6xl\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.386723 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24f28bdb-20e7-4942-b441-7f8c01096e58" (UID: "24f28bdb-20e7-4942-b441-7f8c01096e58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.390392 4788 scope.go:117] "RemoveContainer" containerID="f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425" Feb 19 09:23:32 crc kubenswrapper[4788]: E0219 09:23:32.392409 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425\": container with ID starting with f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425 not found: ID does not exist" containerID="f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.392487 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425"} err="failed to get container status \"f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425\": rpc error: code = NotFound desc = could not find container \"f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425\": container with ID starting with f6316dc3bb6574e58c27b4c7f2c1a6e2148441ae0c01428c0103a408fd0eb425 not found: ID does not exist" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.392547 4788 scope.go:117] "RemoveContainer" containerID="7d2c6706e1e083debbe20c50f5f3d90f03fa3ad03c18c64228062d6159983589" Feb 19 09:23:32 crc kubenswrapper[4788]: E0219 09:23:32.393012 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2c6706e1e083debbe20c50f5f3d90f03fa3ad03c18c64228062d6159983589\": container with ID starting with 7d2c6706e1e083debbe20c50f5f3d90f03fa3ad03c18c64228062d6159983589 not found: ID does not exist" containerID="7d2c6706e1e083debbe20c50f5f3d90f03fa3ad03c18c64228062d6159983589" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.393086 
4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2c6706e1e083debbe20c50f5f3d90f03fa3ad03c18c64228062d6159983589"} err="failed to get container status \"7d2c6706e1e083debbe20c50f5f3d90f03fa3ad03c18c64228062d6159983589\": rpc error: code = NotFound desc = could not find container \"7d2c6706e1e083debbe20c50f5f3d90f03fa3ad03c18c64228062d6159983589\": container with ID starting with 7d2c6706e1e083debbe20c50f5f3d90f03fa3ad03c18c64228062d6159983589 not found: ID does not exist" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.393122 4788 scope.go:117] "RemoveContainer" containerID="1363ca8ea618b47fd1f671c23dac50a742f2222090dd626698f8f334c7c1ec30" Feb 19 09:23:32 crc kubenswrapper[4788]: E0219 09:23:32.395842 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1363ca8ea618b47fd1f671c23dac50a742f2222090dd626698f8f334c7c1ec30\": container with ID starting with 1363ca8ea618b47fd1f671c23dac50a742f2222090dd626698f8f334c7c1ec30 not found: ID does not exist" containerID="1363ca8ea618b47fd1f671c23dac50a742f2222090dd626698f8f334c7c1ec30" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.395876 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1363ca8ea618b47fd1f671c23dac50a742f2222090dd626698f8f334c7c1ec30"} err="failed to get container status \"1363ca8ea618b47fd1f671c23dac50a742f2222090dd626698f8f334c7c1ec30\": rpc error: code = NotFound desc = could not find container \"1363ca8ea618b47fd1f671c23dac50a742f2222090dd626698f8f334c7c1ec30\": container with ID starting with 1363ca8ea618b47fd1f671c23dac50a742f2222090dd626698f8f334c7c1ec30 not found: ID does not exist" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.468026 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/24f28bdb-20e7-4942-b441-7f8c01096e58-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.534952 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4mnb"] Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.545138 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p4mnb"] Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.552403 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6xjp"] Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.560616 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6xjp"] Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.715480 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:23:32 crc kubenswrapper[4788]: E0219 09:23:32.715826 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.724959 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f28bdb-20e7-4942-b441-7f8c01096e58" path="/var/lib/kubelet/pods/24f28bdb-20e7-4942-b441-7f8c01096e58/volumes" Feb 19 09:23:32 crc kubenswrapper[4788]: I0219 09:23:32.725863 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" path="/var/lib/kubelet/pods/d95bbd01-ebd1-46b6-9b09-7b60f18980b1/volumes" Feb 19 09:23:44 crc 
kubenswrapper[4788]: I0219 09:23:44.714889 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:23:44 crc kubenswrapper[4788]: E0219 09:23:44.716119 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:23:49 crc kubenswrapper[4788]: I0219 09:23:49.360258 4788 generic.go:334] "Generic (PLEG): container finished" podID="015b4a74-0341-4e84-862c-d627e79f1318" containerID="af38f66766abe4723ba3de3d20cb774f00e35bb0097262d58f23bfbc0cc8692d" exitCode=0 Feb 19 09:23:49 crc kubenswrapper[4788]: I0219 09:23:49.360303 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" event={"ID":"015b4a74-0341-4e84-862c-d627e79f1318","Type":"ContainerDied","Data":"af38f66766abe4723ba3de3d20cb774f00e35bb0097262d58f23bfbc0cc8692d"} Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.862630 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.965119 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-0\") pod \"015b4a74-0341-4e84-862c-d627e79f1318\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.965206 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9blnz\" (UniqueName: \"kubernetes.io/projected/015b4a74-0341-4e84-862c-d627e79f1318-kube-api-access-9blnz\") pod \"015b4a74-0341-4e84-862c-d627e79f1318\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.965293 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-combined-ca-bundle\") pod \"015b4a74-0341-4e84-862c-d627e79f1318\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.965330 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-3\") pod \"015b4a74-0341-4e84-862c-d627e79f1318\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.965483 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-inventory\") pod \"015b4a74-0341-4e84-862c-d627e79f1318\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.965526 4788 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-0\") pod \"015b4a74-0341-4e84-862c-d627e79f1318\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.965569 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-1\") pod \"015b4a74-0341-4e84-862c-d627e79f1318\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.965619 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-2\") pod \"015b4a74-0341-4e84-862c-d627e79f1318\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.965648 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-1\") pod \"015b4a74-0341-4e84-862c-d627e79f1318\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.965669 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/015b4a74-0341-4e84-862c-d627e79f1318-nova-extra-config-0\") pod \"015b4a74-0341-4e84-862c-d627e79f1318\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.965719 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-ssh-key-openstack-edpm-ipam\") pod \"015b4a74-0341-4e84-862c-d627e79f1318\" (UID: \"015b4a74-0341-4e84-862c-d627e79f1318\") " Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.972455 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "015b4a74-0341-4e84-862c-d627e79f1318" (UID: "015b4a74-0341-4e84-862c-d627e79f1318"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.975093 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015b4a74-0341-4e84-862c-d627e79f1318-kube-api-access-9blnz" (OuterVolumeSpecName: "kube-api-access-9blnz") pod "015b4a74-0341-4e84-862c-d627e79f1318" (UID: "015b4a74-0341-4e84-862c-d627e79f1318"). InnerVolumeSpecName "kube-api-access-9blnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:23:50 crc kubenswrapper[4788]: I0219 09:23:50.992871 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "015b4a74-0341-4e84-862c-d627e79f1318" (UID: "015b4a74-0341-4e84-862c-d627e79f1318"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.000099 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "015b4a74-0341-4e84-862c-d627e79f1318" (UID: "015b4a74-0341-4e84-862c-d627e79f1318"). 
InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.000903 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "015b4a74-0341-4e84-862c-d627e79f1318" (UID: "015b4a74-0341-4e84-862c-d627e79f1318"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.003524 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-inventory" (OuterVolumeSpecName: "inventory") pod "015b4a74-0341-4e84-862c-d627e79f1318" (UID: "015b4a74-0341-4e84-862c-d627e79f1318"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.004507 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/015b4a74-0341-4e84-862c-d627e79f1318-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "015b4a74-0341-4e84-862c-d627e79f1318" (UID: "015b4a74-0341-4e84-862c-d627e79f1318"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.007519 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "015b4a74-0341-4e84-862c-d627e79f1318" (UID: "015b4a74-0341-4e84-862c-d627e79f1318"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.007811 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "015b4a74-0341-4e84-862c-d627e79f1318" (UID: "015b4a74-0341-4e84-862c-d627e79f1318"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.016826 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "015b4a74-0341-4e84-862c-d627e79f1318" (UID: "015b4a74-0341-4e84-862c-d627e79f1318"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.021215 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "015b4a74-0341-4e84-862c-d627e79f1318" (UID: "015b4a74-0341-4e84-862c-d627e79f1318"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.068505 4788 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.068808 4788 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/015b4a74-0341-4e84-862c-d627e79f1318-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.068915 4788 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.069010 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.069097 4788 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.069173 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9blnz\" (UniqueName: \"kubernetes.io/projected/015b4a74-0341-4e84-862c-d627e79f1318-kube-api-access-9blnz\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.069269 4788 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.069349 4788 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.069439 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.069521 4788 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.069600 4788 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/015b4a74-0341-4e84-862c-d627e79f1318-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.385996 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" event={"ID":"015b4a74-0341-4e84-862c-d627e79f1318","Type":"ContainerDied","Data":"b39f9cf86605e5050d9edd274c4d061e5a4275559e3c59639dd6d020dc68719b"} Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.386044 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b39f9cf86605e5050d9edd274c4d061e5a4275559e3c59639dd6d020dc68719b" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.386106 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-l5jz4" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.503273 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr"] Feb 19 09:23:51 crc kubenswrapper[4788]: E0219 09:23:51.503666 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerName="extract-content" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.503689 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerName="extract-content" Feb 19 09:23:51 crc kubenswrapper[4788]: E0219 09:23:51.503701 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" containerName="extract-content" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.503707 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" containerName="extract-content" Feb 19 09:23:51 crc kubenswrapper[4788]: E0219 09:23:51.503723 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" containerName="registry-server" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.503730 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" containerName="registry-server" Feb 19 09:23:51 crc kubenswrapper[4788]: E0219 09:23:51.503743 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerName="extract-utilities" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.503750 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerName="extract-utilities" Feb 19 09:23:51 crc kubenswrapper[4788]: E0219 09:23:51.503773 4788 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" containerName="extract-utilities" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.503780 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" containerName="extract-utilities" Feb 19 09:23:51 crc kubenswrapper[4788]: E0219 09:23:51.503791 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerName="registry-server" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.503796 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerName="registry-server" Feb 19 09:23:51 crc kubenswrapper[4788]: E0219 09:23:51.503811 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015b4a74-0341-4e84-862c-d627e79f1318" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.503817 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="015b4a74-0341-4e84-862c-d627e79f1318" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.503998 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f28bdb-20e7-4942-b441-7f8c01096e58" containerName="registry-server" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.504013 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95bbd01-ebd1-46b6-9b09-7b60f18980b1" containerName="registry-server" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.504024 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="015b4a74-0341-4e84-862c-d627e79f1318" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.504631 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.507814 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.507819 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7mxrm" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.508089 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.509678 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.509707 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.516119 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr"] Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.683033 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.683278 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.683413 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.684149 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.684216 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.684423 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvln2\" (UniqueName: \"kubernetes.io/projected/030858ba-0181-4da7-afa6-ec5fb6cefc0f-kube-api-access-tvln2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.684702 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.788153 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.788315 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.788400 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvln2\" (UniqueName: \"kubernetes.io/projected/030858ba-0181-4da7-afa6-ec5fb6cefc0f-kube-api-access-tvln2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.788538 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.788574 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.788617 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.788658 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.794924 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.795173 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.795449 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.796434 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.796557 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.803030 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.807569 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvln2\" (UniqueName: \"kubernetes.io/projected/030858ba-0181-4da7-afa6-ec5fb6cefc0f-kube-api-access-tvln2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:51 crc kubenswrapper[4788]: I0219 09:23:51.823539 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:23:52 crc kubenswrapper[4788]: I0219 09:23:52.210494 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr"] Feb 19 09:23:52 crc kubenswrapper[4788]: I0219 09:23:52.399984 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" event={"ID":"030858ba-0181-4da7-afa6-ec5fb6cefc0f","Type":"ContainerStarted","Data":"bd31bb90ff3a8479b499795046a76553f7af530bd74965ac7283ee3350b62388"} Feb 19 09:23:53 crc kubenswrapper[4788]: I0219 09:23:53.407993 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" event={"ID":"030858ba-0181-4da7-afa6-ec5fb6cefc0f","Type":"ContainerStarted","Data":"479f53d91e53439c39cb2f2fc01afa5f644f9f92e2786b6f8efae696c1ed4354"} Feb 19 09:23:53 crc kubenswrapper[4788]: I0219 09:23:53.433348 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" podStartSLOduration=1.6246732860000002 podStartE2EDuration="2.433326343s" podCreationTimestamp="2026-02-19 09:23:51 +0000 UTC" firstStartedPulling="2026-02-19 09:23:52.21811243 +0000 UTC m=+2334.206123922" lastFinishedPulling="2026-02-19 09:23:53.026765507 +0000 UTC m=+2335.014776979" observedRunningTime="2026-02-19 09:23:53.423815912 +0000 UTC m=+2335.411827394" watchObservedRunningTime="2026-02-19 09:23:53.433326343 +0000 UTC m=+2335.421337825" Feb 19 09:23:59 crc kubenswrapper[4788]: I0219 09:23:59.715175 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:23:59 crc kubenswrapper[4788]: E0219 09:23:59.716131 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:24:12 crc kubenswrapper[4788]: I0219 09:24:12.715138 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:24:12 crc kubenswrapper[4788]: E0219 09:24:12.716232 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:24:27 crc kubenswrapper[4788]: I0219 09:24:27.714772 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:24:27 crc kubenswrapper[4788]: E0219 09:24:27.715546 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:24:39 crc kubenswrapper[4788]: I0219 09:24:39.713734 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:24:39 crc kubenswrapper[4788]: E0219 09:24:39.714444 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:24:52 crc kubenswrapper[4788]: I0219 09:24:52.714723 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:24:52 crc kubenswrapper[4788]: E0219 09:24:52.715516 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:25:05 crc kubenswrapper[4788]: I0219 09:25:05.714074 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:25:05 crc kubenswrapper[4788]: E0219 09:25:05.714935 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:25:16 crc kubenswrapper[4788]: I0219 09:25:16.714677 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:25:16 crc kubenswrapper[4788]: E0219 09:25:16.716799 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:25:27 crc kubenswrapper[4788]: I0219 09:25:27.714467 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:25:27 crc kubenswrapper[4788]: E0219 09:25:27.715235 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:25:39 crc kubenswrapper[4788]: I0219 09:25:39.715008 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:25:39 crc kubenswrapper[4788]: E0219 09:25:39.715896 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:25:51 crc kubenswrapper[4788]: I0219 09:25:51.714749 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:25:51 crc kubenswrapper[4788]: E0219 09:25:51.715496 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:26:05 crc kubenswrapper[4788]: I0219 09:26:05.714348 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:26:05 crc kubenswrapper[4788]: E0219 09:26:05.715030 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:26:09 crc kubenswrapper[4788]: I0219 09:26:09.644768 4788 generic.go:334] "Generic (PLEG): container finished" podID="030858ba-0181-4da7-afa6-ec5fb6cefc0f" containerID="479f53d91e53439c39cb2f2fc01afa5f644f9f92e2786b6f8efae696c1ed4354" exitCode=0 Feb 19 09:26:09 crc kubenswrapper[4788]: I0219 09:26:09.644888 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" event={"ID":"030858ba-0181-4da7-afa6-ec5fb6cefc0f","Type":"ContainerDied","Data":"479f53d91e53439c39cb2f2fc01afa5f644f9f92e2786b6f8efae696c1ed4354"} Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.072036 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.133896 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-1\") pod \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.133967 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-inventory\") pod \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.134010 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvln2\" (UniqueName: \"kubernetes.io/projected/030858ba-0181-4da7-afa6-ec5fb6cefc0f-kube-api-access-tvln2\") pod \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.134052 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-0\") pod \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.134076 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ssh-key-openstack-edpm-ipam\") pod \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 
09:26:11.134111 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-telemetry-combined-ca-bundle\") pod \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.134141 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-2\") pod \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\" (UID: \"030858ba-0181-4da7-afa6-ec5fb6cefc0f\") " Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.148554 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/030858ba-0181-4da7-afa6-ec5fb6cefc0f-kube-api-access-tvln2" (OuterVolumeSpecName: "kube-api-access-tvln2") pod "030858ba-0181-4da7-afa6-ec5fb6cefc0f" (UID: "030858ba-0181-4da7-afa6-ec5fb6cefc0f"). InnerVolumeSpecName "kube-api-access-tvln2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.148553 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "030858ba-0181-4da7-afa6-ec5fb6cefc0f" (UID: "030858ba-0181-4da7-afa6-ec5fb6cefc0f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.162754 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "030858ba-0181-4da7-afa6-ec5fb6cefc0f" (UID: "030858ba-0181-4da7-afa6-ec5fb6cefc0f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.163515 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-inventory" (OuterVolumeSpecName: "inventory") pod "030858ba-0181-4da7-afa6-ec5fb6cefc0f" (UID: "030858ba-0181-4da7-afa6-ec5fb6cefc0f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.163832 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "030858ba-0181-4da7-afa6-ec5fb6cefc0f" (UID: "030858ba-0181-4da7-afa6-ec5fb6cefc0f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.168456 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "030858ba-0181-4da7-afa6-ec5fb6cefc0f" (UID: "030858ba-0181-4da7-afa6-ec5fb6cefc0f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.174571 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "030858ba-0181-4da7-afa6-ec5fb6cefc0f" (UID: "030858ba-0181-4da7-afa6-ec5fb6cefc0f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.235510 4788 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.235546 4788 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-inventory\") on node \"crc\" DevicePath \"\"" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.235557 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvln2\" (UniqueName: \"kubernetes.io/projected/030858ba-0181-4da7-afa6-ec5fb6cefc0f-kube-api-access-tvln2\") on node \"crc\" DevicePath \"\"" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.235567 4788 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.235577 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 19 09:26:11 crc 
kubenswrapper[4788]: I0219 09:26:11.235587 4788 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.235597 4788 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/030858ba-0181-4da7-afa6-ec5fb6cefc0f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.670280 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" event={"ID":"030858ba-0181-4da7-afa6-ec5fb6cefc0f","Type":"ContainerDied","Data":"bd31bb90ff3a8479b499795046a76553f7af530bd74965ac7283ee3350b62388"} Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.670339 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd31bb90ff3a8479b499795046a76553f7af530bd74965ac7283ee3350b62388" Feb 19 09:26:11 crc kubenswrapper[4788]: I0219 09:26:11.670434 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr" Feb 19 09:26:11 crc kubenswrapper[4788]: E0219 09:26:11.885498 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod030858ba_0181_4da7_afa6_ec5fb6cefc0f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod030858ba_0181_4da7_afa6_ec5fb6cefc0f.slice/crio-bd31bb90ff3a8479b499795046a76553f7af530bd74965ac7283ee3350b62388\": RecentStats: unable to find data in memory cache]" Feb 19 09:26:17 crc kubenswrapper[4788]: I0219 09:26:17.715927 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:26:17 crc kubenswrapper[4788]: E0219 09:26:17.717215 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:26:30 crc kubenswrapper[4788]: I0219 09:26:30.715083 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:26:30 crc kubenswrapper[4788]: E0219 09:26:30.715628 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" 
podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:26:41 crc kubenswrapper[4788]: I0219 09:26:41.714793 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:26:41 crc kubenswrapper[4788]: E0219 09:26:41.715623 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:26:53 crc kubenswrapper[4788]: I0219 09:26:53.715020 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:26:54 crc kubenswrapper[4788]: I0219 09:26:54.319276 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"7c0c08df672c69038ccb3b11abd1c4a577ae594aa382ece99e7a2cf62ee3717b"} Feb 19 09:27:09 crc kubenswrapper[4788]: I0219 09:27:09.809428 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 09:27:09 crc kubenswrapper[4788]: E0219 09:27:09.810331 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030858ba-0181-4da7-afa6-ec5fb6cefc0f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 09:27:09 crc kubenswrapper[4788]: I0219 09:27:09.810351 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="030858ba-0181-4da7-afa6-ec5fb6cefc0f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 09:27:09 crc kubenswrapper[4788]: I0219 09:27:09.810586 4788 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="030858ba-0181-4da7-afa6-ec5fb6cefc0f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 19 09:27:09 crc kubenswrapper[4788]: I0219 09:27:09.811313 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 09:27:09 crc kubenswrapper[4788]: I0219 09:27:09.813557 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 19 09:27:09 crc kubenswrapper[4788]: I0219 09:27:09.814055 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 19 09:27:09 crc kubenswrapper[4788]: I0219 09:27:09.814530 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tfph6" Feb 19 09:27:09 crc kubenswrapper[4788]: I0219 09:27:09.814697 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 19 09:27:09 crc kubenswrapper[4788]: I0219 09:27:09.818597 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.011517 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.011567 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.011614 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.011638 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.011691 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-config-data\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.011707 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.011744 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.011760 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.011789 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cn6b\" (UniqueName: \"kubernetes.io/projected/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-kube-api-access-4cn6b\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.113104 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.113166 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.113231 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.113280 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.113331 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-config-data\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.113356 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.113391 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.113407 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.113431 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.113470 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cn6b\" (UniqueName: \"kubernetes.io/projected/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-kube-api-access-4cn6b\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.114638 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.115592 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-config-data\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.115968 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.116119 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.119743 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.119919 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.124864 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.140028 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.152884 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cn6b\" (UniqueName: \"kubernetes.io/projected/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-kube-api-access-4cn6b\") pod \"tempest-tests-tempest\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " 
pod="openstack/tempest-tests-tempest" Feb 19 09:27:10 crc kubenswrapper[4788]: I0219 09:27:10.441076 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 09:27:11 crc kubenswrapper[4788]: I0219 09:27:11.013556 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 19 09:27:11 crc kubenswrapper[4788]: I0219 09:27:11.019585 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:27:11 crc kubenswrapper[4788]: I0219 09:27:11.507113 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59d3eda3-8975-46b8-8cfa-27b4dcd210f7","Type":"ContainerStarted","Data":"89cb4b5ba55c51485dd6e66de4f1282fc9be45293d42aedca5db3fd5fb643797"} Feb 19 09:27:41 crc kubenswrapper[4788]: E0219 09:27:41.937854 4788 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 19 09:27:41 crc kubenswrapper[4788]: E0219 09:27:41.938751 4788 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4cn6b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(59d3eda3-8975-46b8-8cfa-27b4dcd210f7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 19 09:27:41 crc kubenswrapper[4788]: E0219 09:27:41.939963 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="59d3eda3-8975-46b8-8cfa-27b4dcd210f7" Feb 19 09:27:42 crc kubenswrapper[4788]: E0219 09:27:42.794701 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="59d3eda3-8975-46b8-8cfa-27b4dcd210f7" Feb 19 09:27:58 crc 
kubenswrapper[4788]: I0219 09:27:58.133720 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59d3eda3-8975-46b8-8cfa-27b4dcd210f7","Type":"ContainerStarted","Data":"39eacaf87e7b12eaaed6b022b02900c466a885730b35fc3997ca205c18c06627"} Feb 19 09:27:58 crc kubenswrapper[4788]: I0219 09:27:58.151673 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.715012208 podStartE2EDuration="50.151656123s" podCreationTimestamp="2026-02-19 09:27:08 +0000 UTC" firstStartedPulling="2026-02-19 09:27:11.019282869 +0000 UTC m=+2533.007294361" lastFinishedPulling="2026-02-19 09:27:56.455926794 +0000 UTC m=+2578.443938276" observedRunningTime="2026-02-19 09:27:58.150013996 +0000 UTC m=+2580.138025468" watchObservedRunningTime="2026-02-19 09:27:58.151656123 +0000 UTC m=+2580.139667595" Feb 19 09:29:22 crc kubenswrapper[4788]: I0219 09:29:22.139872 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:29:22 crc kubenswrapper[4788]: I0219 09:29:22.140682 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:29:52 crc kubenswrapper[4788]: I0219 09:29:52.139596 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 19 09:29:52 crc kubenswrapper[4788]: I0219 09:29:52.140195 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.142656 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5"] Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.147850 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.152231 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.152435 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.181141 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5"] Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.220078 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71fd4bc0-24c4-4c75-8fee-932761e910c2-config-volume\") pod \"collect-profiles-29524890-ctmf5\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.220211 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71fd4bc0-24c4-4c75-8fee-932761e910c2-secret-volume\") pod \"collect-profiles-29524890-ctmf5\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.220388 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfklr\" (UniqueName: \"kubernetes.io/projected/71fd4bc0-24c4-4c75-8fee-932761e910c2-kube-api-access-vfklr\") pod \"collect-profiles-29524890-ctmf5\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.321997 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71fd4bc0-24c4-4c75-8fee-932761e910c2-secret-volume\") pod \"collect-profiles-29524890-ctmf5\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.322353 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfklr\" (UniqueName: \"kubernetes.io/projected/71fd4bc0-24c4-4c75-8fee-932761e910c2-kube-api-access-vfklr\") pod \"collect-profiles-29524890-ctmf5\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.322494 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71fd4bc0-24c4-4c75-8fee-932761e910c2-config-volume\") pod \"collect-profiles-29524890-ctmf5\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.323470 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71fd4bc0-24c4-4c75-8fee-932761e910c2-config-volume\") pod \"collect-profiles-29524890-ctmf5\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.328046 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71fd4bc0-24c4-4c75-8fee-932761e910c2-secret-volume\") pod \"collect-profiles-29524890-ctmf5\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.337193 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfklr\" (UniqueName: \"kubernetes.io/projected/71fd4bc0-24c4-4c75-8fee-932761e910c2-kube-api-access-vfklr\") pod \"collect-profiles-29524890-ctmf5\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:00 crc kubenswrapper[4788]: I0219 09:30:00.468914 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:01 crc kubenswrapper[4788]: W0219 09:30:01.011215 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71fd4bc0_24c4_4c75_8fee_932761e910c2.slice/crio-2fea0e08efb5d4f67226eea5fc31bea8b78949e0baf02ddb6a6265094f110958 WatchSource:0}: Error finding container 2fea0e08efb5d4f67226eea5fc31bea8b78949e0baf02ddb6a6265094f110958: Status 404 returned error can't find the container with id 2fea0e08efb5d4f67226eea5fc31bea8b78949e0baf02ddb6a6265094f110958 Feb 19 09:30:01 crc kubenswrapper[4788]: I0219 09:30:01.031237 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5"] Feb 19 09:30:01 crc kubenswrapper[4788]: I0219 09:30:01.998059 4788 generic.go:334] "Generic (PLEG): container finished" podID="71fd4bc0-24c4-4c75-8fee-932761e910c2" containerID="73fbdfca0ca5303a0b06df619d07fff0c255525e705c64f4afcb79a99c1e3b47" exitCode=0 Feb 19 09:30:01 crc kubenswrapper[4788]: I0219 09:30:01.998226 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" event={"ID":"71fd4bc0-24c4-4c75-8fee-932761e910c2","Type":"ContainerDied","Data":"73fbdfca0ca5303a0b06df619d07fff0c255525e705c64f4afcb79a99c1e3b47"} Feb 19 09:30:01 crc kubenswrapper[4788]: I0219 09:30:01.998386 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" event={"ID":"71fd4bc0-24c4-4c75-8fee-932761e910c2","Type":"ContainerStarted","Data":"2fea0e08efb5d4f67226eea5fc31bea8b78949e0baf02ddb6a6265094f110958"} Feb 19 09:30:03 crc kubenswrapper[4788]: I0219 09:30:03.600202 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:03 crc kubenswrapper[4788]: I0219 09:30:03.706092 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71fd4bc0-24c4-4c75-8fee-932761e910c2-secret-volume\") pod \"71fd4bc0-24c4-4c75-8fee-932761e910c2\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " Feb 19 09:30:03 crc kubenswrapper[4788]: I0219 09:30:03.707256 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71fd4bc0-24c4-4c75-8fee-932761e910c2-config-volume\") pod \"71fd4bc0-24c4-4c75-8fee-932761e910c2\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " Feb 19 09:30:03 crc kubenswrapper[4788]: I0219 09:30:03.707299 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfklr\" (UniqueName: \"kubernetes.io/projected/71fd4bc0-24c4-4c75-8fee-932761e910c2-kube-api-access-vfklr\") pod \"71fd4bc0-24c4-4c75-8fee-932761e910c2\" (UID: \"71fd4bc0-24c4-4c75-8fee-932761e910c2\") " Feb 19 09:30:03 crc kubenswrapper[4788]: I0219 09:30:03.708311 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71fd4bc0-24c4-4c75-8fee-932761e910c2-config-volume" (OuterVolumeSpecName: "config-volume") pod "71fd4bc0-24c4-4c75-8fee-932761e910c2" (UID: "71fd4bc0-24c4-4c75-8fee-932761e910c2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:30:03 crc kubenswrapper[4788]: I0219 09:30:03.714366 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71fd4bc0-24c4-4c75-8fee-932761e910c2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "71fd4bc0-24c4-4c75-8fee-932761e910c2" (UID: "71fd4bc0-24c4-4c75-8fee-932761e910c2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:30:03 crc kubenswrapper[4788]: I0219 09:30:03.714413 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71fd4bc0-24c4-4c75-8fee-932761e910c2-kube-api-access-vfklr" (OuterVolumeSpecName: "kube-api-access-vfklr") pod "71fd4bc0-24c4-4c75-8fee-932761e910c2" (UID: "71fd4bc0-24c4-4c75-8fee-932761e910c2"). InnerVolumeSpecName "kube-api-access-vfklr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:30:03 crc kubenswrapper[4788]: I0219 09:30:03.809472 4788 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/71fd4bc0-24c4-4c75-8fee-932761e910c2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 19 09:30:03 crc kubenswrapper[4788]: I0219 09:30:03.809514 4788 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/71fd4bc0-24c4-4c75-8fee-932761e910c2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 19 09:30:03 crc kubenswrapper[4788]: I0219 09:30:03.809527 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfklr\" (UniqueName: \"kubernetes.io/projected/71fd4bc0-24c4-4c75-8fee-932761e910c2-kube-api-access-vfklr\") on node \"crc\" DevicePath \"\"" Feb 19 09:30:04 crc kubenswrapper[4788]: I0219 09:30:04.017034 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" event={"ID":"71fd4bc0-24c4-4c75-8fee-932761e910c2","Type":"ContainerDied","Data":"2fea0e08efb5d4f67226eea5fc31bea8b78949e0baf02ddb6a6265094f110958"} Feb 19 09:30:04 crc kubenswrapper[4788]: I0219 09:30:04.017327 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fea0e08efb5d4f67226eea5fc31bea8b78949e0baf02ddb6a6265094f110958" Feb 19 09:30:04 crc kubenswrapper[4788]: I0219 09:30:04.017388 4788 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524890-ctmf5" Feb 19 09:30:04 crc kubenswrapper[4788]: I0219 09:30:04.679956 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq"] Feb 19 09:30:04 crc kubenswrapper[4788]: I0219 09:30:04.690756 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524845-8nftq"] Feb 19 09:30:04 crc kubenswrapper[4788]: I0219 09:30:04.731632 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6826dee1-4dec-4b7c-88a1-600eb014574c" path="/var/lib/kubelet/pods/6826dee1-4dec-4b7c-88a1-600eb014574c/volumes" Feb 19 09:30:22 crc kubenswrapper[4788]: I0219 09:30:22.139175 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:30:22 crc kubenswrapper[4788]: I0219 09:30:22.139965 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:30:22 crc kubenswrapper[4788]: I0219 09:30:22.140032 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 09:30:22 crc kubenswrapper[4788]: I0219 09:30:22.141539 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c0c08df672c69038ccb3b11abd1c4a577ae594aa382ece99e7a2cf62ee3717b"} 
pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 09:30:22 crc kubenswrapper[4788]: I0219 09:30:22.141632 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://7c0c08df672c69038ccb3b11abd1c4a577ae594aa382ece99e7a2cf62ee3717b" gracePeriod=600 Feb 19 09:30:23 crc kubenswrapper[4788]: I0219 09:30:23.196893 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="7c0c08df672c69038ccb3b11abd1c4a577ae594aa382ece99e7a2cf62ee3717b" exitCode=0 Feb 19 09:30:23 crc kubenswrapper[4788]: I0219 09:30:23.196965 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"7c0c08df672c69038ccb3b11abd1c4a577ae594aa382ece99e7a2cf62ee3717b"} Feb 19 09:30:23 crc kubenswrapper[4788]: I0219 09:30:23.197587 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69"} Feb 19 09:30:23 crc kubenswrapper[4788]: I0219 09:30:23.197616 4788 scope.go:117] "RemoveContainer" containerID="294fa22b686aa3a21ffb41a3eeed9e3ad0ea98ff638daa11ee38966c8ea2ce91" Feb 19 09:30:24 crc kubenswrapper[4788]: I0219 09:30:24.110136 4788 scope.go:117] "RemoveContainer" containerID="358449ecb2f9cdb5274a35d672b21a1e0a91b14ff53f4bd28527d24009a3d03e" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.118477 4788 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-s72kf"] Feb 19 09:30:51 crc kubenswrapper[4788]: E0219 09:30:51.119492 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71fd4bc0-24c4-4c75-8fee-932761e910c2" containerName="collect-profiles" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.119512 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="71fd4bc0-24c4-4c75-8fee-932761e910c2" containerName="collect-profiles" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.119739 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="71fd4bc0-24c4-4c75-8fee-932761e910c2" containerName="collect-profiles" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.121438 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.132050 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s72kf"] Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.259535 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzqtx\" (UniqueName: \"kubernetes.io/projected/7491bad1-552f-4e5b-b91b-8e734eb92326-kube-api-access-zzqtx\") pod \"community-operators-s72kf\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.259588 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-utilities\") pod \"community-operators-s72kf\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.259625 4788 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-catalog-content\") pod \"community-operators-s72kf\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.361312 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzqtx\" (UniqueName: \"kubernetes.io/projected/7491bad1-552f-4e5b-b91b-8e734eb92326-kube-api-access-zzqtx\") pod \"community-operators-s72kf\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.361361 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-utilities\") pod \"community-operators-s72kf\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.361400 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-catalog-content\") pod \"community-operators-s72kf\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.362044 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-utilities\") pod \"community-operators-s72kf\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.362164 4788 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-catalog-content\") pod \"community-operators-s72kf\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.385254 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzqtx\" (UniqueName: \"kubernetes.io/projected/7491bad1-552f-4e5b-b91b-8e734eb92326-kube-api-access-zzqtx\") pod \"community-operators-s72kf\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:30:51 crc kubenswrapper[4788]: I0219 09:30:51.473346 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:30:52 crc kubenswrapper[4788]: I0219 09:30:52.033290 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s72kf"] Feb 19 09:30:52 crc kubenswrapper[4788]: I0219 09:30:52.459857 4788 generic.go:334] "Generic (PLEG): container finished" podID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerID="6d98acae83c31ad71f055eeeb0efdffcd46744019309ff805de6e378d924686b" exitCode=0 Feb 19 09:30:52 crc kubenswrapper[4788]: I0219 09:30:52.460025 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s72kf" event={"ID":"7491bad1-552f-4e5b-b91b-8e734eb92326","Type":"ContainerDied","Data":"6d98acae83c31ad71f055eeeb0efdffcd46744019309ff805de6e378d924686b"} Feb 19 09:30:52 crc kubenswrapper[4788]: I0219 09:30:52.460183 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s72kf" event={"ID":"7491bad1-552f-4e5b-b91b-8e734eb92326","Type":"ContainerStarted","Data":"71e15221bfadd5e07b9c72e143243ec0c57432ef666f959b719670aca08fd3d6"} Feb 19 09:30:53 crc 
kubenswrapper[4788]: I0219 09:30:53.477225 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s72kf" event={"ID":"7491bad1-552f-4e5b-b91b-8e734eb92326","Type":"ContainerStarted","Data":"2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746"} Feb 19 09:30:58 crc kubenswrapper[4788]: I0219 09:30:58.538039 4788 generic.go:334] "Generic (PLEG): container finished" podID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerID="2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746" exitCode=0 Feb 19 09:30:58 crc kubenswrapper[4788]: I0219 09:30:58.538129 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s72kf" event={"ID":"7491bad1-552f-4e5b-b91b-8e734eb92326","Type":"ContainerDied","Data":"2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746"} Feb 19 09:30:59 crc kubenswrapper[4788]: I0219 09:30:59.548820 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s72kf" event={"ID":"7491bad1-552f-4e5b-b91b-8e734eb92326","Type":"ContainerStarted","Data":"0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e"} Feb 19 09:30:59 crc kubenswrapper[4788]: I0219 09:30:59.569299 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s72kf" podStartSLOduration=1.966798117 podStartE2EDuration="8.569282685s" podCreationTimestamp="2026-02-19 09:30:51 +0000 UTC" firstStartedPulling="2026-02-19 09:30:52.462283569 +0000 UTC m=+2754.450295041" lastFinishedPulling="2026-02-19 09:30:59.064768137 +0000 UTC m=+2761.052779609" observedRunningTime="2026-02-19 09:30:59.565597622 +0000 UTC m=+2761.553609094" watchObservedRunningTime="2026-02-19 09:30:59.569282685 +0000 UTC m=+2761.557294157" Feb 19 09:31:01 crc kubenswrapper[4788]: I0219 09:31:01.473512 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:31:01 crc kubenswrapper[4788]: I0219 09:31:01.474101 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:31:02 crc kubenswrapper[4788]: I0219 09:31:02.522068 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-s72kf" podUID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerName="registry-server" probeResult="failure" output=< Feb 19 09:31:02 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 09:31:02 crc kubenswrapper[4788]: > Feb 19 09:31:11 crc kubenswrapper[4788]: I0219 09:31:11.528052 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:31:11 crc kubenswrapper[4788]: I0219 09:31:11.592113 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:31:11 crc kubenswrapper[4788]: I0219 09:31:11.763165 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s72kf"] Feb 19 09:31:12 crc kubenswrapper[4788]: I0219 09:31:12.696722 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s72kf" podUID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerName="registry-server" containerID="cri-o://0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e" gracePeriod=2 Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.397082 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.547033 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzqtx\" (UniqueName: \"kubernetes.io/projected/7491bad1-552f-4e5b-b91b-8e734eb92326-kube-api-access-zzqtx\") pod \"7491bad1-552f-4e5b-b91b-8e734eb92326\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.547627 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-catalog-content\") pod \"7491bad1-552f-4e5b-b91b-8e734eb92326\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.547870 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-utilities\") pod \"7491bad1-552f-4e5b-b91b-8e734eb92326\" (UID: \"7491bad1-552f-4e5b-b91b-8e734eb92326\") " Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.548579 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-utilities" (OuterVolumeSpecName: "utilities") pod "7491bad1-552f-4e5b-b91b-8e734eb92326" (UID: "7491bad1-552f-4e5b-b91b-8e734eb92326"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.548987 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.552650 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7491bad1-552f-4e5b-b91b-8e734eb92326-kube-api-access-zzqtx" (OuterVolumeSpecName: "kube-api-access-zzqtx") pod "7491bad1-552f-4e5b-b91b-8e734eb92326" (UID: "7491bad1-552f-4e5b-b91b-8e734eb92326"). InnerVolumeSpecName "kube-api-access-zzqtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.611227 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7491bad1-552f-4e5b-b91b-8e734eb92326" (UID: "7491bad1-552f-4e5b-b91b-8e734eb92326"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.651742 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzqtx\" (UniqueName: \"kubernetes.io/projected/7491bad1-552f-4e5b-b91b-8e734eb92326-kube-api-access-zzqtx\") on node \"crc\" DevicePath \"\"" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.652383 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491bad1-552f-4e5b-b91b-8e734eb92326-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.710537 4788 generic.go:334] "Generic (PLEG): container finished" podID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerID="0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e" exitCode=0 Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.710600 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s72kf" event={"ID":"7491bad1-552f-4e5b-b91b-8e734eb92326","Type":"ContainerDied","Data":"0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e"} Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.710646 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s72kf" event={"ID":"7491bad1-552f-4e5b-b91b-8e734eb92326","Type":"ContainerDied","Data":"71e15221bfadd5e07b9c72e143243ec0c57432ef666f959b719670aca08fd3d6"} Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.710653 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s72kf" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.710677 4788 scope.go:117] "RemoveContainer" containerID="0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.737748 4788 scope.go:117] "RemoveContainer" containerID="2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.779702 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s72kf"] Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.779819 4788 scope.go:117] "RemoveContainer" containerID="6d98acae83c31ad71f055eeeb0efdffcd46744019309ff805de6e378d924686b" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.797721 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s72kf"] Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.839359 4788 scope.go:117] "RemoveContainer" containerID="0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e" Feb 19 09:31:13 crc kubenswrapper[4788]: E0219 09:31:13.840011 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e\": container with ID starting with 0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e not found: ID does not exist" containerID="0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.840040 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e"} err="failed to get container status \"0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e\": rpc error: code = NotFound desc = could not find 
container \"0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e\": container with ID starting with 0e8270a866acb49d28f03ce1362a66e93d18b0e436898a59b13feb1c0c1b2b8e not found: ID does not exist" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.840064 4788 scope.go:117] "RemoveContainer" containerID="2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746" Feb 19 09:31:13 crc kubenswrapper[4788]: E0219 09:31:13.840399 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746\": container with ID starting with 2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746 not found: ID does not exist" containerID="2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.840532 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746"} err="failed to get container status \"2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746\": rpc error: code = NotFound desc = could not find container \"2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746\": container with ID starting with 2b52e2254eeaf661e7f9ae21056a089186cece4dc04fb7e19282213b44342746 not found: ID does not exist" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.840667 4788 scope.go:117] "RemoveContainer" containerID="6d98acae83c31ad71f055eeeb0efdffcd46744019309ff805de6e378d924686b" Feb 19 09:31:13 crc kubenswrapper[4788]: E0219 09:31:13.841092 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d98acae83c31ad71f055eeeb0efdffcd46744019309ff805de6e378d924686b\": container with ID starting with 6d98acae83c31ad71f055eeeb0efdffcd46744019309ff805de6e378d924686b not found: ID does 
not exist" containerID="6d98acae83c31ad71f055eeeb0efdffcd46744019309ff805de6e378d924686b" Feb 19 09:31:13 crc kubenswrapper[4788]: I0219 09:31:13.841118 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d98acae83c31ad71f055eeeb0efdffcd46744019309ff805de6e378d924686b"} err="failed to get container status \"6d98acae83c31ad71f055eeeb0efdffcd46744019309ff805de6e378d924686b\": rpc error: code = NotFound desc = could not find container \"6d98acae83c31ad71f055eeeb0efdffcd46744019309ff805de6e378d924686b\": container with ID starting with 6d98acae83c31ad71f055eeeb0efdffcd46744019309ff805de6e378d924686b not found: ID does not exist" Feb 19 09:31:14 crc kubenswrapper[4788]: I0219 09:31:14.731958 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7491bad1-552f-4e5b-b91b-8e734eb92326" path="/var/lib/kubelet/pods/7491bad1-552f-4e5b-b91b-8e734eb92326/volumes" Feb 19 09:32:22 crc kubenswrapper[4788]: I0219 09:32:22.138994 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:32:22 crc kubenswrapper[4788]: I0219 09:32:22.139495 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:32:52 crc kubenswrapper[4788]: I0219 09:32:52.139140 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 19 09:32:52 crc kubenswrapper[4788]: I0219 09:32:52.139905 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.553707 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-plzbq"] Feb 19 09:33:09 crc kubenswrapper[4788]: E0219 09:33:09.554572 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerName="registry-server" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.554586 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerName="registry-server" Feb 19 09:33:09 crc kubenswrapper[4788]: E0219 09:33:09.554595 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerName="extract-utilities" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.554601 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerName="extract-utilities" Feb 19 09:33:09 crc kubenswrapper[4788]: E0219 09:33:09.554618 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerName="extract-content" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.554624 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerName="extract-content" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.554795 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="7491bad1-552f-4e5b-b91b-8e734eb92326" containerName="registry-server" Feb 
19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.556121 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.581529 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-plzbq"] Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.729086 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-catalog-content\") pod \"redhat-marketplace-plzbq\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.729176 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-utilities\") pod \"redhat-marketplace-plzbq\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.729216 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldqnd\" (UniqueName: \"kubernetes.io/projected/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-kube-api-access-ldqnd\") pod \"redhat-marketplace-plzbq\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.831234 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-catalog-content\") pod \"redhat-marketplace-plzbq\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " pod="openshift-marketplace/redhat-marketplace-plzbq" 
Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.831555 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-utilities\") pod \"redhat-marketplace-plzbq\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.831986 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-catalog-content\") pod \"redhat-marketplace-plzbq\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.832019 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-utilities\") pod \"redhat-marketplace-plzbq\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.832092 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldqnd\" (UniqueName: \"kubernetes.io/projected/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-kube-api-access-ldqnd\") pod \"redhat-marketplace-plzbq\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 09:33:09.858656 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldqnd\" (UniqueName: \"kubernetes.io/projected/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-kube-api-access-ldqnd\") pod \"redhat-marketplace-plzbq\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:09 crc kubenswrapper[4788]: I0219 
09:33:09.887716 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:10 crc kubenswrapper[4788]: W0219 09:33:10.383821 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25653ae6_4998_49c3_a1a0_1f1ed41cb6ef.slice/crio-f2305ce6e5c4ddee901f623e63d8d85c6a8b980a1a1ab289dc0cf265b5fba3f1 WatchSource:0}: Error finding container f2305ce6e5c4ddee901f623e63d8d85c6a8b980a1a1ab289dc0cf265b5fba3f1: Status 404 returned error can't find the container with id f2305ce6e5c4ddee901f623e63d8d85c6a8b980a1a1ab289dc0cf265b5fba3f1 Feb 19 09:33:10 crc kubenswrapper[4788]: I0219 09:33:10.391212 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-plzbq"] Feb 19 09:33:10 crc kubenswrapper[4788]: I0219 09:33:10.930275 4788 generic.go:334] "Generic (PLEG): container finished" podID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" containerID="91358fb93c57ea14204080d65354b558c62e383de5ac71760170cfb364579d0d" exitCode=0 Feb 19 09:33:10 crc kubenswrapper[4788]: I0219 09:33:10.930329 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzbq" event={"ID":"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef","Type":"ContainerDied","Data":"91358fb93c57ea14204080d65354b558c62e383de5ac71760170cfb364579d0d"} Feb 19 09:33:10 crc kubenswrapper[4788]: I0219 09:33:10.930648 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzbq" event={"ID":"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef","Type":"ContainerStarted","Data":"f2305ce6e5c4ddee901f623e63d8d85c6a8b980a1a1ab289dc0cf265b5fba3f1"} Feb 19 09:33:10 crc kubenswrapper[4788]: I0219 09:33:10.933615 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:33:11 crc kubenswrapper[4788]: I0219 09:33:11.940084 4788 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzbq" event={"ID":"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef","Type":"ContainerStarted","Data":"27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b"} Feb 19 09:33:12 crc kubenswrapper[4788]: I0219 09:33:12.952457 4788 generic.go:334] "Generic (PLEG): container finished" podID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" containerID="27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b" exitCode=0 Feb 19 09:33:12 crc kubenswrapper[4788]: I0219 09:33:12.952566 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzbq" event={"ID":"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef","Type":"ContainerDied","Data":"27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b"} Feb 19 09:33:13 crc kubenswrapper[4788]: I0219 09:33:13.972841 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzbq" event={"ID":"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef","Type":"ContainerStarted","Data":"328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e"} Feb 19 09:33:14 crc kubenswrapper[4788]: I0219 09:33:14.002860 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-plzbq" podStartSLOduration=2.525190697 podStartE2EDuration="5.002840556s" podCreationTimestamp="2026-02-19 09:33:09 +0000 UTC" firstStartedPulling="2026-02-19 09:33:10.933346528 +0000 UTC m=+2892.921358010" lastFinishedPulling="2026-02-19 09:33:13.410996407 +0000 UTC m=+2895.399007869" observedRunningTime="2026-02-19 09:33:14.000179559 +0000 UTC m=+2895.988191091" watchObservedRunningTime="2026-02-19 09:33:14.002840556 +0000 UTC m=+2895.990852028" Feb 19 09:33:19 crc kubenswrapper[4788]: I0219 09:33:19.888061 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:19 crc 
kubenswrapper[4788]: I0219 09:33:19.888492 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:19 crc kubenswrapper[4788]: I0219 09:33:19.951378 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:20 crc kubenswrapper[4788]: I0219 09:33:20.068554 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:20 crc kubenswrapper[4788]: I0219 09:33:20.192869 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-plzbq"] Feb 19 09:33:22 crc kubenswrapper[4788]: I0219 09:33:22.044300 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-plzbq" podUID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" containerName="registry-server" containerID="cri-o://328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e" gracePeriod=2 Feb 19 09:33:22 crc kubenswrapper[4788]: I0219 09:33:22.139190 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:33:22 crc kubenswrapper[4788]: I0219 09:33:22.139309 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:33:22 crc kubenswrapper[4788]: I0219 09:33:22.139380 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 09:33:22 crc kubenswrapper[4788]: I0219 09:33:22.140421 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 09:33:22 crc kubenswrapper[4788]: I0219 09:33:22.140523 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" gracePeriod=600 Feb 19 09:33:22 crc kubenswrapper[4788]: E0219 09:33:22.789512 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.027208 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.055422 4788 generic.go:334] "Generic (PLEG): container finished" podID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" containerID="328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e" exitCode=0 Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.055517 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzbq" event={"ID":"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef","Type":"ContainerDied","Data":"328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e"} Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.055561 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-plzbq" event={"ID":"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef","Type":"ContainerDied","Data":"f2305ce6e5c4ddee901f623e63d8d85c6a8b980a1a1ab289dc0cf265b5fba3f1"} Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.055591 4788 scope.go:117] "RemoveContainer" containerID="328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.055770 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-plzbq" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.071572 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" exitCode=0 Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.071647 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69"} Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.074766 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:33:23 crc kubenswrapper[4788]: E0219 09:33:23.076507 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.088088 4788 scope.go:117] "RemoveContainer" containerID="27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.118942 4788 scope.go:117] "RemoveContainer" containerID="91358fb93c57ea14204080d65354b558c62e383de5ac71760170cfb364579d0d" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.148684 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-catalog-content\") pod 
\"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.148735 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-utilities\") pod \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.148919 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldqnd\" (UniqueName: \"kubernetes.io/projected/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-kube-api-access-ldqnd\") pod \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\" (UID: \"25653ae6-4998-49c3-a1a0-1f1ed41cb6ef\") " Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.150020 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-utilities" (OuterVolumeSpecName: "utilities") pod "25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" (UID: "25653ae6-4998-49c3-a1a0-1f1ed41cb6ef"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.150815 4788 scope.go:117] "RemoveContainer" containerID="328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e" Feb 19 09:33:23 crc kubenswrapper[4788]: E0219 09:33:23.151453 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e\": container with ID starting with 328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e not found: ID does not exist" containerID="328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.151491 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e"} err="failed to get container status \"328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e\": rpc error: code = NotFound desc = could not find container \"328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e\": container with ID starting with 328f3a339bbdc6acebaf9c5cc3c632c06a5dcd93ebb8f54c163259d0bf87e79e not found: ID does not exist" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.151520 4788 scope.go:117] "RemoveContainer" containerID="27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b" Feb 19 09:33:23 crc kubenswrapper[4788]: E0219 09:33:23.151841 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b\": container with ID starting with 27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b not found: ID does not exist" containerID="27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.151883 
4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b"} err="failed to get container status \"27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b\": rpc error: code = NotFound desc = could not find container \"27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b\": container with ID starting with 27cf5141c6eaf858751f46fb955fd246da3431798744b6b00deb4d95b336293b not found: ID does not exist" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.151898 4788 scope.go:117] "RemoveContainer" containerID="91358fb93c57ea14204080d65354b558c62e383de5ac71760170cfb364579d0d" Feb 19 09:33:23 crc kubenswrapper[4788]: E0219 09:33:23.152316 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91358fb93c57ea14204080d65354b558c62e383de5ac71760170cfb364579d0d\": container with ID starting with 91358fb93c57ea14204080d65354b558c62e383de5ac71760170cfb364579d0d not found: ID does not exist" containerID="91358fb93c57ea14204080d65354b558c62e383de5ac71760170cfb364579d0d" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.152347 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91358fb93c57ea14204080d65354b558c62e383de5ac71760170cfb364579d0d"} err="failed to get container status \"91358fb93c57ea14204080d65354b558c62e383de5ac71760170cfb364579d0d\": rpc error: code = NotFound desc = could not find container \"91358fb93c57ea14204080d65354b558c62e383de5ac71760170cfb364579d0d\": container with ID starting with 91358fb93c57ea14204080d65354b558c62e383de5ac71760170cfb364579d0d not found: ID does not exist" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.152393 4788 scope.go:117] "RemoveContainer" containerID="7c0c08df672c69038ccb3b11abd1c4a577ae594aa382ece99e7a2cf62ee3717b" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 
09:33:23.154980 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-kube-api-access-ldqnd" (OuterVolumeSpecName: "kube-api-access-ldqnd") pod "25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" (UID: "25653ae6-4998-49c3-a1a0-1f1ed41cb6ef"). InnerVolumeSpecName "kube-api-access-ldqnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.184671 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" (UID: "25653ae6-4998-49c3-a1a0-1f1ed41cb6ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.250944 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldqnd\" (UniqueName: \"kubernetes.io/projected/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-kube-api-access-ldqnd\") on node \"crc\" DevicePath \"\"" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.250971 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.250984 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.399396 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-plzbq"] Feb 19 09:33:23 crc kubenswrapper[4788]: I0219 09:33:23.407880 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-plzbq"] Feb 19 09:33:23 crc kubenswrapper[4788]: E0219 09:33:23.523819 4788 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25653ae6_4998_49c3_a1a0_1f1ed41cb6ef.slice\": RecentStats: unable to find data in memory cache]" Feb 19 09:33:24 crc kubenswrapper[4788]: I0219 09:33:24.736068 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" path="/var/lib/kubelet/pods/25653ae6-4998-49c3-a1a0-1f1ed41cb6ef/volumes" Feb 19 09:33:36 crc kubenswrapper[4788]: I0219 09:33:36.714950 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:33:36 crc kubenswrapper[4788]: E0219 09:33:36.716080 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:33:47 crc kubenswrapper[4788]: I0219 09:33:47.714899 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:33:47 crc kubenswrapper[4788]: E0219 09:33:47.716629 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 
09:34:01 crc kubenswrapper[4788]: I0219 09:34:01.715459 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:34:01 crc kubenswrapper[4788]: E0219 09:34:01.716520 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:34:16 crc kubenswrapper[4788]: I0219 09:34:16.715165 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:34:16 crc kubenswrapper[4788]: E0219 09:34:16.716326 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:34:31 crc kubenswrapper[4788]: I0219 09:34:31.715262 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:34:31 crc kubenswrapper[4788]: E0219 09:34:31.715934 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" 
podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:34:44 crc kubenswrapper[4788]: I0219 09:34:44.714549 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:34:44 crc kubenswrapper[4788]: E0219 09:34:44.715280 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:34:58 crc kubenswrapper[4788]: I0219 09:34:58.732981 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:34:58 crc kubenswrapper[4788]: E0219 09:34:58.733800 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:35:13 crc kubenswrapper[4788]: I0219 09:35:13.714714 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:35:13 crc kubenswrapper[4788]: E0219 09:35:13.715786 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:35:28 crc kubenswrapper[4788]: I0219 09:35:28.720826 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:35:28 crc kubenswrapper[4788]: E0219 09:35:28.721873 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:35:39 crc kubenswrapper[4788]: I0219 09:35:39.714174 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:35:39 crc kubenswrapper[4788]: E0219 09:35:39.714883 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:35:54 crc kubenswrapper[4788]: I0219 09:35:54.714573 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:35:54 crc kubenswrapper[4788]: E0219 09:35:54.715953 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:36:05 crc kubenswrapper[4788]: I0219 09:36:05.714726 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:36:05 crc kubenswrapper[4788]: E0219 09:36:05.715581 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:36:16 crc kubenswrapper[4788]: I0219 09:36:16.715141 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:36:16 crc kubenswrapper[4788]: E0219 09:36:16.716527 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:36:28 crc kubenswrapper[4788]: I0219 09:36:28.728597 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:36:28 crc kubenswrapper[4788]: E0219 09:36:28.729571 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:36:40 crc kubenswrapper[4788]: I0219 09:36:40.714635 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:36:40 crc kubenswrapper[4788]: E0219 09:36:40.715475 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:36:53 crc kubenswrapper[4788]: I0219 09:36:53.714704 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:36:53 crc kubenswrapper[4788]: E0219 09:36:53.715592 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:37:04 crc kubenswrapper[4788]: I0219 09:37:04.714819 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:37:04 crc kubenswrapper[4788]: E0219 09:37:04.715857 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:37:15 crc kubenswrapper[4788]: I0219 09:37:15.715051 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:37:15 crc kubenswrapper[4788]: E0219 09:37:15.715940 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:37:26 crc kubenswrapper[4788]: I0219 09:37:26.715025 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:37:26 crc kubenswrapper[4788]: E0219 09:37:26.716298 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:37:39 crc kubenswrapper[4788]: I0219 09:37:39.715561 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:37:39 crc kubenswrapper[4788]: E0219 09:37:39.716686 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:37:53 crc kubenswrapper[4788]: I0219 09:37:53.715087 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:37:53 crc kubenswrapper[4788]: E0219 09:37:53.716449 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:38:08 crc kubenswrapper[4788]: I0219 09:38:08.721193 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:38:08 crc kubenswrapper[4788]: E0219 09:38:08.722175 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:38:23 crc kubenswrapper[4788]: I0219 09:38:23.714777 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:38:24 crc kubenswrapper[4788]: I0219 09:38:24.260589 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" 
event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"3e559d6ada7592b7d324b68c3abdf68767d50a4eff6ea2aef554a1723fbf1a30"} Feb 19 09:39:00 crc kubenswrapper[4788]: I0219 09:39:00.666697 4788 generic.go:334] "Generic (PLEG): container finished" podID="59d3eda3-8975-46b8-8cfa-27b4dcd210f7" containerID="39eacaf87e7b12eaaed6b022b02900c466a885730b35fc3997ca205c18c06627" exitCode=1 Feb 19 09:39:00 crc kubenswrapper[4788]: I0219 09:39:00.666801 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59d3eda3-8975-46b8-8cfa-27b4dcd210f7","Type":"ContainerDied","Data":"39eacaf87e7b12eaaed6b022b02900c466a885730b35fc3997ca205c18c06627"} Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.351061 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.422565 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.422676 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ca-certs\") pod \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.422744 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config-secret\") pod \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.422789 
4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-config-data\") pod \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.422871 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-temporary\") pod \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.423025 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config\") pod \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.423065 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ssh-key\") pod \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.423120 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-workdir\") pod \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.423216 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cn6b\" (UniqueName: 
\"kubernetes.io/projected/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-kube-api-access-4cn6b\") pod \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\" (UID: \"59d3eda3-8975-46b8-8cfa-27b4dcd210f7\") " Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.423795 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "59d3eda3-8975-46b8-8cfa-27b4dcd210f7" (UID: "59d3eda3-8975-46b8-8cfa-27b4dcd210f7"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.423881 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-config-data" (OuterVolumeSpecName: "config-data") pod "59d3eda3-8975-46b8-8cfa-27b4dcd210f7" (UID: "59d3eda3-8975-46b8-8cfa-27b4dcd210f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.429906 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-kube-api-access-4cn6b" (OuterVolumeSpecName: "kube-api-access-4cn6b") pod "59d3eda3-8975-46b8-8cfa-27b4dcd210f7" (UID: "59d3eda3-8975-46b8-8cfa-27b4dcd210f7"). InnerVolumeSpecName "kube-api-access-4cn6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.434503 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "59d3eda3-8975-46b8-8cfa-27b4dcd210f7" (UID: "59d3eda3-8975-46b8-8cfa-27b4dcd210f7"). 
InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.435359 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "test-operator-logs") pod "59d3eda3-8975-46b8-8cfa-27b4dcd210f7" (UID: "59d3eda3-8975-46b8-8cfa-27b4dcd210f7"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.452540 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "59d3eda3-8975-46b8-8cfa-27b4dcd210f7" (UID: "59d3eda3-8975-46b8-8cfa-27b4dcd210f7"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.474998 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "59d3eda3-8975-46b8-8cfa-27b4dcd210f7" (UID: "59d3eda3-8975-46b8-8cfa-27b4dcd210f7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.479411 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "59d3eda3-8975-46b8-8cfa-27b4dcd210f7" (UID: "59d3eda3-8975-46b8-8cfa-27b4dcd210f7"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.496536 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "59d3eda3-8975-46b8-8cfa-27b4dcd210f7" (UID: "59d3eda3-8975-46b8-8cfa-27b4dcd210f7"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.525989 4788 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.526026 4788 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.526040 4788 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.526057 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cn6b\" (UniqueName: \"kubernetes.io/projected/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-kube-api-access-4cn6b\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.526107 4788 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.526120 4788 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.526131 4788 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.526141 4788 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-config-data\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.526152 4788 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/59d3eda3-8975-46b8-8cfa-27b4dcd210f7-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.546029 4788 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.627989 4788 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.691504 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"59d3eda3-8975-46b8-8cfa-27b4dcd210f7","Type":"ContainerDied","Data":"89cb4b5ba55c51485dd6e66de4f1282fc9be45293d42aedca5db3fd5fb643797"} Feb 19 09:39:02 crc kubenswrapper[4788]: I0219 09:39:02.691561 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89cb4b5ba55c51485dd6e66de4f1282fc9be45293d42aedca5db3fd5fb643797" Feb 19 09:39:02 crc 
kubenswrapper[4788]: I0219 09:39:02.691614 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 19 09:39:04 crc kubenswrapper[4788]: I0219 09:39:04.850445 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 09:39:04 crc kubenswrapper[4788]: E0219 09:39:04.851459 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" containerName="registry-server" Feb 19 09:39:04 crc kubenswrapper[4788]: I0219 09:39:04.851484 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" containerName="registry-server" Feb 19 09:39:04 crc kubenswrapper[4788]: E0219 09:39:04.851507 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" containerName="extract-utilities" Feb 19 09:39:04 crc kubenswrapper[4788]: I0219 09:39:04.851520 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" containerName="extract-utilities" Feb 19 09:39:04 crc kubenswrapper[4788]: E0219 09:39:04.851550 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59d3eda3-8975-46b8-8cfa-27b4dcd210f7" containerName="tempest-tests-tempest-tests-runner" Feb 19 09:39:04 crc kubenswrapper[4788]: I0219 09:39:04.851563 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="59d3eda3-8975-46b8-8cfa-27b4dcd210f7" containerName="tempest-tests-tempest-tests-runner" Feb 19 09:39:04 crc kubenswrapper[4788]: E0219 09:39:04.851621 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" containerName="extract-content" Feb 19 09:39:04 crc kubenswrapper[4788]: I0219 09:39:04.851633 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" containerName="extract-content" Feb 19 09:39:04 crc 
kubenswrapper[4788]: I0219 09:39:04.851975 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="25653ae6-4998-49c3-a1a0-1f1ed41cb6ef" containerName="registry-server" Feb 19 09:39:04 crc kubenswrapper[4788]: I0219 09:39:04.852008 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="59d3eda3-8975-46b8-8cfa-27b4dcd210f7" containerName="tempest-tests-tempest-tests-runner" Feb 19 09:39:04 crc kubenswrapper[4788]: I0219 09:39:04.853043 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 09:39:04 crc kubenswrapper[4788]: I0219 09:39:04.861709 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 09:39:04 crc kubenswrapper[4788]: I0219 09:39:04.895289 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-tfph6" Feb 19 09:39:05 crc kubenswrapper[4788]: I0219 09:39:05.006558 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8wwx\" (UniqueName: \"kubernetes.io/projected/03eee958-9fa0-4e8d-8f47-a40b7fab0b78-kube-api-access-d8wwx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03eee958-9fa0-4e8d-8f47-a40b7fab0b78\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 09:39:05 crc kubenswrapper[4788]: I0219 09:39:05.006863 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03eee958-9fa0-4e8d-8f47-a40b7fab0b78\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 09:39:05 crc kubenswrapper[4788]: I0219 09:39:05.108939 4788 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-d8wwx\" (UniqueName: \"kubernetes.io/projected/03eee958-9fa0-4e8d-8f47-a40b7fab0b78-kube-api-access-d8wwx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03eee958-9fa0-4e8d-8f47-a40b7fab0b78\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 09:39:05 crc kubenswrapper[4788]: I0219 09:39:05.109089 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03eee958-9fa0-4e8d-8f47-a40b7fab0b78\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 09:39:05 crc kubenswrapper[4788]: I0219 09:39:05.109540 4788 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03eee958-9fa0-4e8d-8f47-a40b7fab0b78\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 09:39:05 crc kubenswrapper[4788]: I0219 09:39:05.142514 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8wwx\" (UniqueName: \"kubernetes.io/projected/03eee958-9fa0-4e8d-8f47-a40b7fab0b78-kube-api-access-d8wwx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03eee958-9fa0-4e8d-8f47-a40b7fab0b78\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 09:39:05 crc kubenswrapper[4788]: I0219 09:39:05.146883 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"03eee958-9fa0-4e8d-8f47-a40b7fab0b78\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 09:39:05 crc kubenswrapper[4788]: I0219 09:39:05.223237 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 19 09:39:05 crc kubenswrapper[4788]: I0219 09:39:05.715390 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:39:05 crc kubenswrapper[4788]: I0219 09:39:05.722737 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 19 09:39:05 crc kubenswrapper[4788]: I0219 09:39:05.729815 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"03eee958-9fa0-4e8d-8f47-a40b7fab0b78","Type":"ContainerStarted","Data":"c1f2449bde47cfdeacc513cfb9710f394410ac0c92beecaa08fe46490509bc4b"} Feb 19 09:39:07 crc kubenswrapper[4788]: I0219 09:39:07.759826 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"03eee958-9fa0-4e8d-8f47-a40b7fab0b78","Type":"ContainerStarted","Data":"d9a35c6109753f5b74871c6b0a483758a0c1eefd6ceb26ecfacff3b67367931b"} Feb 19 09:39:07 crc kubenswrapper[4788]: I0219 09:39:07.787265 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.960546983 podStartE2EDuration="3.787221873s" podCreationTimestamp="2026-02-19 09:39:04 +0000 UTC" firstStartedPulling="2026-02-19 09:39:05.712405967 +0000 UTC m=+3247.700417449" lastFinishedPulling="2026-02-19 09:39:06.539080857 +0000 UTC m=+3248.527092339" observedRunningTime="2026-02-19 09:39:07.783762894 +0000 UTC m=+3249.771774406" watchObservedRunningTime="2026-02-19 09:39:07.787221873 +0000 UTC m=+3249.775233375" Feb 19 09:39:10 crc kubenswrapper[4788]: 
I0219 09:39:10.194131 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6snts"] Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.197650 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.215540 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6snts"] Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.328483 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-catalog-content\") pod \"redhat-operators-6snts\" (UID: \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.328541 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl45b\" (UniqueName: \"kubernetes.io/projected/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-kube-api-access-gl45b\") pod \"redhat-operators-6snts\" (UID: \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.328683 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-utilities\") pod \"redhat-operators-6snts\" (UID: \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.430736 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-utilities\") pod \"redhat-operators-6snts\" (UID: 
\"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.430986 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-catalog-content\") pod \"redhat-operators-6snts\" (UID: \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.431426 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-utilities\") pod \"redhat-operators-6snts\" (UID: \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.431579 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-catalog-content\") pod \"redhat-operators-6snts\" (UID: \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.431645 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl45b\" (UniqueName: \"kubernetes.io/projected/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-kube-api-access-gl45b\") pod \"redhat-operators-6snts\" (UID: \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.464355 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl45b\" (UniqueName: \"kubernetes.io/projected/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-kube-api-access-gl45b\") pod \"redhat-operators-6snts\" (UID: \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " 
pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:10 crc kubenswrapper[4788]: I0219 09:39:10.539623 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:11 crc kubenswrapper[4788]: I0219 09:39:11.093011 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6snts"] Feb 19 09:39:11 crc kubenswrapper[4788]: I0219 09:39:11.806889 4788 generic.go:334] "Generic (PLEG): container finished" podID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" containerID="6461d5dba0e7b1be212fee4ab422eb7ed85fd52341928213744b8f1516e07fce" exitCode=0 Feb 19 09:39:11 crc kubenswrapper[4788]: I0219 09:39:11.807334 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6snts" event={"ID":"e4620826-2dd9-4df6-a4cb-bbdc24a672cd","Type":"ContainerDied","Data":"6461d5dba0e7b1be212fee4ab422eb7ed85fd52341928213744b8f1516e07fce"} Feb 19 09:39:11 crc kubenswrapper[4788]: I0219 09:39:11.807376 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6snts" event={"ID":"e4620826-2dd9-4df6-a4cb-bbdc24a672cd","Type":"ContainerStarted","Data":"3955141d6a7da1033d746329e94e8984af732c6428a080b36fb90105689c97b9"} Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.599741 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-llnd9"] Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.602928 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.650218 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llnd9"] Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.677705 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhcdb\" (UniqueName: \"kubernetes.io/projected/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-kube-api-access-fhcdb\") pod \"certified-operators-llnd9\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.677936 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-utilities\") pod \"certified-operators-llnd9\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.678194 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-catalog-content\") pod \"certified-operators-llnd9\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.780126 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-catalog-content\") pod \"certified-operators-llnd9\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.780330 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fhcdb\" (UniqueName: \"kubernetes.io/projected/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-kube-api-access-fhcdb\") pod \"certified-operators-llnd9\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.780356 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-utilities\") pod \"certified-operators-llnd9\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.780905 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-catalog-content\") pod \"certified-operators-llnd9\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.781040 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-utilities\") pod \"certified-operators-llnd9\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.802407 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhcdb\" (UniqueName: \"kubernetes.io/projected/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-kube-api-access-fhcdb\") pod \"certified-operators-llnd9\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.819015 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-6snts" event={"ID":"e4620826-2dd9-4df6-a4cb-bbdc24a672cd","Type":"ContainerStarted","Data":"dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999"} Feb 19 09:39:12 crc kubenswrapper[4788]: I0219 09:39:12.992807 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:13 crc kubenswrapper[4788]: I0219 09:39:13.538148 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-llnd9"] Feb 19 09:39:13 crc kubenswrapper[4788]: I0219 09:39:13.833131 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llnd9" event={"ID":"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15","Type":"ContainerStarted","Data":"e2005c471a15b2187f5d5cead81f88d724c2f00e0d2dcb2e8678f640ebeee3a5"} Feb 19 09:39:14 crc kubenswrapper[4788]: I0219 09:39:14.843729 4788 generic.go:334] "Generic (PLEG): container finished" podID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" containerID="a2712ce57dd3834881378bd6e87bc934858c4c1997853b82811f0d780d3262d0" exitCode=0 Feb 19 09:39:14 crc kubenswrapper[4788]: I0219 09:39:14.843797 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llnd9" event={"ID":"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15","Type":"ContainerDied","Data":"a2712ce57dd3834881378bd6e87bc934858c4c1997853b82811f0d780d3262d0"} Feb 19 09:39:15 crc kubenswrapper[4788]: I0219 09:39:15.856859 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llnd9" event={"ID":"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15","Type":"ContainerStarted","Data":"e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346"} Feb 19 09:39:20 crc kubenswrapper[4788]: I0219 09:39:20.922117 4788 generic.go:334] "Generic (PLEG): container finished" podID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" 
containerID="dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999" exitCode=0 Feb 19 09:39:20 crc kubenswrapper[4788]: I0219 09:39:20.922168 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6snts" event={"ID":"e4620826-2dd9-4df6-a4cb-bbdc24a672cd","Type":"ContainerDied","Data":"dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999"} Feb 19 09:39:20 crc kubenswrapper[4788]: I0219 09:39:20.926656 4788 generic.go:334] "Generic (PLEG): container finished" podID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" containerID="e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346" exitCode=0 Feb 19 09:39:20 crc kubenswrapper[4788]: I0219 09:39:20.926720 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llnd9" event={"ID":"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15","Type":"ContainerDied","Data":"e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346"} Feb 19 09:39:22 crc kubenswrapper[4788]: I0219 09:39:22.949695 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llnd9" event={"ID":"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15","Type":"ContainerStarted","Data":"cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940"} Feb 19 09:39:22 crc kubenswrapper[4788]: I0219 09:39:22.952931 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6snts" event={"ID":"e4620826-2dd9-4df6-a4cb-bbdc24a672cd","Type":"ContainerStarted","Data":"bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b"} Feb 19 09:39:22 crc kubenswrapper[4788]: I0219 09:39:22.984500 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-llnd9" podStartSLOduration=3.731646907 podStartE2EDuration="10.984480692s" podCreationTimestamp="2026-02-19 09:39:12 +0000 UTC" firstStartedPulling="2026-02-19 09:39:14.845527519 +0000 
UTC m=+3256.833538991" lastFinishedPulling="2026-02-19 09:39:22.098361264 +0000 UTC m=+3264.086372776" observedRunningTime="2026-02-19 09:39:22.976758872 +0000 UTC m=+3264.964770364" watchObservedRunningTime="2026-02-19 09:39:22.984480692 +0000 UTC m=+3264.972492174" Feb 19 09:39:22 crc kubenswrapper[4788]: I0219 09:39:22.994020 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:22 crc kubenswrapper[4788]: I0219 09:39:22.994099 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:23 crc kubenswrapper[4788]: I0219 09:39:23.003176 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6snts" podStartSLOduration=2.759473121 podStartE2EDuration="13.003154204s" podCreationTimestamp="2026-02-19 09:39:10 +0000 UTC" firstStartedPulling="2026-02-19 09:39:11.809442342 +0000 UTC m=+3253.797453844" lastFinishedPulling="2026-02-19 09:39:22.053123425 +0000 UTC m=+3264.041134927" observedRunningTime="2026-02-19 09:39:23.00221685 +0000 UTC m=+3264.990228322" watchObservedRunningTime="2026-02-19 09:39:23.003154204 +0000 UTC m=+3264.991165686" Feb 19 09:39:24 crc kubenswrapper[4788]: I0219 09:39:24.053200 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-llnd9" podUID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" containerName="registry-server" probeResult="failure" output=< Feb 19 09:39:24 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 09:39:24 crc kubenswrapper[4788]: > Feb 19 09:39:30 crc kubenswrapper[4788]: I0219 09:39:30.539792 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:30 crc kubenswrapper[4788]: I0219 09:39:30.540508 4788 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:30 crc kubenswrapper[4788]: I0219 09:39:30.626049 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:31 crc kubenswrapper[4788]: I0219 09:39:31.103492 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:31 crc kubenswrapper[4788]: I0219 09:39:31.166533 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6snts"] Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.062225 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6snts" podUID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" containerName="registry-server" containerID="cri-o://bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b" gracePeriod=2 Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.070438 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.154159 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.567261 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.643837 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-catalog-content\") pod \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\" (UID: \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.644270 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-utilities\") pod \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\" (UID: \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.644301 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl45b\" (UniqueName: \"kubernetes.io/projected/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-kube-api-access-gl45b\") pod \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\" (UID: \"e4620826-2dd9-4df6-a4cb-bbdc24a672cd\") " Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.645902 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-utilities" (OuterVolumeSpecName: "utilities") pod "e4620826-2dd9-4df6-a4cb-bbdc24a672cd" (UID: "e4620826-2dd9-4df6-a4cb-bbdc24a672cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.649915 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-kube-api-access-gl45b" (OuterVolumeSpecName: "kube-api-access-gl45b") pod "e4620826-2dd9-4df6-a4cb-bbdc24a672cd" (UID: "e4620826-2dd9-4df6-a4cb-bbdc24a672cd"). InnerVolumeSpecName "kube-api-access-gl45b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.746335 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.746365 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl45b\" (UniqueName: \"kubernetes.io/projected/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-kube-api-access-gl45b\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.796375 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4620826-2dd9-4df6-a4cb-bbdc24a672cd" (UID: "e4620826-2dd9-4df6-a4cb-bbdc24a672cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:39:33 crc kubenswrapper[4788]: I0219 09:39:33.848901 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4620826-2dd9-4df6-a4cb-bbdc24a672cd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.075832 4788 generic.go:334] "Generic (PLEG): container finished" podID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" containerID="bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b" exitCode=0 Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.075986 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6snts" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.075932 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6snts" event={"ID":"e4620826-2dd9-4df6-a4cb-bbdc24a672cd","Type":"ContainerDied","Data":"bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b"} Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.076218 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6snts" event={"ID":"e4620826-2dd9-4df6-a4cb-bbdc24a672cd","Type":"ContainerDied","Data":"3955141d6a7da1033d746329e94e8984af732c6428a080b36fb90105689c97b9"} Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.076283 4788 scope.go:117] "RemoveContainer" containerID="bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.126661 4788 scope.go:117] "RemoveContainer" containerID="dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.149589 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6snts"] Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.164091 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6snts"] Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.170833 4788 scope.go:117] "RemoveContainer" containerID="6461d5dba0e7b1be212fee4ab422eb7ed85fd52341928213744b8f1516e07fce" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.226534 4788 scope.go:117] "RemoveContainer" containerID="bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b" Feb 19 09:39:34 crc kubenswrapper[4788]: E0219 09:39:34.227436 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b\": container with ID starting with bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b not found: ID does not exist" containerID="bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.227487 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b"} err="failed to get container status \"bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b\": rpc error: code = NotFound desc = could not find container \"bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b\": container with ID starting with bfc171c36f3e21e8a6a2c41396802ef4e66e3af31005a073fad9a0282b7da40b not found: ID does not exist" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.227518 4788 scope.go:117] "RemoveContainer" containerID="dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999" Feb 19 09:39:34 crc kubenswrapper[4788]: E0219 09:39:34.227934 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999\": container with ID starting with dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999 not found: ID does not exist" containerID="dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.227964 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999"} err="failed to get container status \"dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999\": rpc error: code = NotFound desc = could not find container \"dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999\": container with ID 
starting with dfffb79915e6ca6d363aee26154bd36b4747f503b0adaae981695dd6fdb79999 not found: ID does not exist" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.227981 4788 scope.go:117] "RemoveContainer" containerID="6461d5dba0e7b1be212fee4ab422eb7ed85fd52341928213744b8f1516e07fce" Feb 19 09:39:34 crc kubenswrapper[4788]: E0219 09:39:34.228274 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6461d5dba0e7b1be212fee4ab422eb7ed85fd52341928213744b8f1516e07fce\": container with ID starting with 6461d5dba0e7b1be212fee4ab422eb7ed85fd52341928213744b8f1516e07fce not found: ID does not exist" containerID="6461d5dba0e7b1be212fee4ab422eb7ed85fd52341928213744b8f1516e07fce" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.228301 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6461d5dba0e7b1be212fee4ab422eb7ed85fd52341928213744b8f1516e07fce"} err="failed to get container status \"6461d5dba0e7b1be212fee4ab422eb7ed85fd52341928213744b8f1516e07fce\": rpc error: code = NotFound desc = could not find container \"6461d5dba0e7b1be212fee4ab422eb7ed85fd52341928213744b8f1516e07fce\": container with ID starting with 6461d5dba0e7b1be212fee4ab422eb7ed85fd52341928213744b8f1516e07fce not found: ID does not exist" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.726407 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" path="/var/lib/kubelet/pods/e4620826-2dd9-4df6-a4cb-bbdc24a672cd/volumes" Feb 19 09:39:34 crc kubenswrapper[4788]: I0219 09:39:34.870511 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-llnd9"] Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.088629 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-llnd9" 
podUID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" containerName="registry-server" containerID="cri-o://cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940" gracePeriod=2 Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.122692 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sb7nx/must-gather-x22vs"] Feb 19 09:39:35 crc kubenswrapper[4788]: E0219 09:39:35.123553 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" containerName="registry-server" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.123581 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" containerName="registry-server" Feb 19 09:39:35 crc kubenswrapper[4788]: E0219 09:39:35.123620 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" containerName="extract-utilities" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.123634 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" containerName="extract-utilities" Feb 19 09:39:35 crc kubenswrapper[4788]: E0219 09:39:35.123660 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" containerName="extract-content" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.123669 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" containerName="extract-content" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.124225 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4620826-2dd9-4df6-a4cb-bbdc24a672cd" containerName="registry-server" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.126423 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sb7nx/must-gather-x22vs" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.129112 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sb7nx"/"openshift-service-ca.crt" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.129382 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sb7nx"/"kube-root-ca.crt" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.191710 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjktx\" (UniqueName: \"kubernetes.io/projected/e2edf248-6e05-430c-9c6e-070d15cbb9b9-kube-api-access-mjktx\") pod \"must-gather-x22vs\" (UID: \"e2edf248-6e05-430c-9c6e-070d15cbb9b9\") " pod="openshift-must-gather-sb7nx/must-gather-x22vs" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.191847 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e2edf248-6e05-430c-9c6e-070d15cbb9b9-must-gather-output\") pod \"must-gather-x22vs\" (UID: \"e2edf248-6e05-430c-9c6e-070d15cbb9b9\") " pod="openshift-must-gather-sb7nx/must-gather-x22vs" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.224770 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sb7nx/must-gather-x22vs"] Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.294711 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e2edf248-6e05-430c-9c6e-070d15cbb9b9-must-gather-output\") pod \"must-gather-x22vs\" (UID: \"e2edf248-6e05-430c-9c6e-070d15cbb9b9\") " pod="openshift-must-gather-sb7nx/must-gather-x22vs" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.294796 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mjktx\" (UniqueName: \"kubernetes.io/projected/e2edf248-6e05-430c-9c6e-070d15cbb9b9-kube-api-access-mjktx\") pod \"must-gather-x22vs\" (UID: \"e2edf248-6e05-430c-9c6e-070d15cbb9b9\") " pod="openshift-must-gather-sb7nx/must-gather-x22vs" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.295164 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e2edf248-6e05-430c-9c6e-070d15cbb9b9-must-gather-output\") pod \"must-gather-x22vs\" (UID: \"e2edf248-6e05-430c-9c6e-070d15cbb9b9\") " pod="openshift-must-gather-sb7nx/must-gather-x22vs" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.322423 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjktx\" (UniqueName: \"kubernetes.io/projected/e2edf248-6e05-430c-9c6e-070d15cbb9b9-kube-api-access-mjktx\") pod \"must-gather-x22vs\" (UID: \"e2edf248-6e05-430c-9c6e-070d15cbb9b9\") " pod="openshift-must-gather-sb7nx/must-gather-x22vs" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.462934 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sb7nx/must-gather-x22vs" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.603310 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.703767 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhcdb\" (UniqueName: \"kubernetes.io/projected/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-kube-api-access-fhcdb\") pod \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.704289 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-catalog-content\") pod \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.704368 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-utilities\") pod \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\" (UID: \"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15\") " Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.705640 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-utilities" (OuterVolumeSpecName: "utilities") pod "ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" (UID: "ca2aa2f1-d0e3-4d88-a360-0a94a5516b15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.724950 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-kube-api-access-fhcdb" (OuterVolumeSpecName: "kube-api-access-fhcdb") pod "ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" (UID: "ca2aa2f1-d0e3-4d88-a360-0a94a5516b15"). InnerVolumeSpecName "kube-api-access-fhcdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.778219 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" (UID: "ca2aa2f1-d0e3-4d88-a360-0a94a5516b15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.808404 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.808437 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.808448 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhcdb\" (UniqueName: \"kubernetes.io/projected/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15-kube-api-access-fhcdb\") on node \"crc\" DevicePath \"\"" Feb 19 09:39:35 crc kubenswrapper[4788]: I0219 09:39:35.916955 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sb7nx/must-gather-x22vs"] Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.095262 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/must-gather-x22vs" event={"ID":"e2edf248-6e05-430c-9c6e-070d15cbb9b9","Type":"ContainerStarted","Data":"b7c67ba1b2cdc2f5057868e82eab58fe7e572bba6fcc6e97be33910cca7e19f9"} Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.096971 4788 generic.go:334] "Generic (PLEG): container finished" podID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" 
containerID="cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940" exitCode=0 Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.097014 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llnd9" event={"ID":"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15","Type":"ContainerDied","Data":"cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940"} Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.097059 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-llnd9" event={"ID":"ca2aa2f1-d0e3-4d88-a360-0a94a5516b15","Type":"ContainerDied","Data":"e2005c471a15b2187f5d5cead81f88d724c2f00e0d2dcb2e8678f640ebeee3a5"} Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.097076 4788 scope.go:117] "RemoveContainer" containerID="cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940" Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.097260 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-llnd9" Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.122129 4788 scope.go:117] "RemoveContainer" containerID="e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346" Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.138479 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-llnd9"] Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.147067 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-llnd9"] Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.160719 4788 scope.go:117] "RemoveContainer" containerID="a2712ce57dd3834881378bd6e87bc934858c4c1997853b82811f0d780d3262d0" Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.224149 4788 scope.go:117] "RemoveContainer" containerID="cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940" Feb 19 09:39:36 crc kubenswrapper[4788]: E0219 09:39:36.224608 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940\": container with ID starting with cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940 not found: ID does not exist" containerID="cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940" Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.224642 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940"} err="failed to get container status \"cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940\": rpc error: code = NotFound desc = could not find container \"cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940\": container with ID starting with cace91c90b7d823c0e0da915d8119e5a8b33140449f6c078f16686909bcaa940 not 
found: ID does not exist" Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.224670 4788 scope.go:117] "RemoveContainer" containerID="e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346" Feb 19 09:39:36 crc kubenswrapper[4788]: E0219 09:39:36.224926 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346\": container with ID starting with e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346 not found: ID does not exist" containerID="e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346" Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.224949 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346"} err="failed to get container status \"e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346\": rpc error: code = NotFound desc = could not find container \"e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346\": container with ID starting with e62566c24dc7239bb6814fd2d0afa717ef2e16fa4e1c44213bea2edf01c97346 not found: ID does not exist" Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.224966 4788 scope.go:117] "RemoveContainer" containerID="a2712ce57dd3834881378bd6e87bc934858c4c1997853b82811f0d780d3262d0" Feb 19 09:39:36 crc kubenswrapper[4788]: E0219 09:39:36.225172 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2712ce57dd3834881378bd6e87bc934858c4c1997853b82811f0d780d3262d0\": container with ID starting with a2712ce57dd3834881378bd6e87bc934858c4c1997853b82811f0d780d3262d0 not found: ID does not exist" containerID="a2712ce57dd3834881378bd6e87bc934858c4c1997853b82811f0d780d3262d0" Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.225192 4788 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2712ce57dd3834881378bd6e87bc934858c4c1997853b82811f0d780d3262d0"} err="failed to get container status \"a2712ce57dd3834881378bd6e87bc934858c4c1997853b82811f0d780d3262d0\": rpc error: code = NotFound desc = could not find container \"a2712ce57dd3834881378bd6e87bc934858c4c1997853b82811f0d780d3262d0\": container with ID starting with a2712ce57dd3834881378bd6e87bc934858c4c1997853b82811f0d780d3262d0 not found: ID does not exist" Feb 19 09:39:36 crc kubenswrapper[4788]: I0219 09:39:36.736901 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" path="/var/lib/kubelet/pods/ca2aa2f1-d0e3-4d88-a360-0a94a5516b15/volumes" Feb 19 09:39:43 crc kubenswrapper[4788]: I0219 09:39:43.165673 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/must-gather-x22vs" event={"ID":"e2edf248-6e05-430c-9c6e-070d15cbb9b9","Type":"ContainerStarted","Data":"4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb"} Feb 19 09:39:43 crc kubenswrapper[4788]: I0219 09:39:43.166186 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/must-gather-x22vs" event={"ID":"e2edf248-6e05-430c-9c6e-070d15cbb9b9","Type":"ContainerStarted","Data":"5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8"} Feb 19 09:39:43 crc kubenswrapper[4788]: I0219 09:39:43.189713 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sb7nx/must-gather-x22vs" podStartSLOduration=1.952887504 podStartE2EDuration="8.189680293s" podCreationTimestamp="2026-02-19 09:39:35 +0000 UTC" firstStartedPulling="2026-02-19 09:39:35.940064961 +0000 UTC m=+3277.928076443" lastFinishedPulling="2026-02-19 09:39:42.17685772 +0000 UTC m=+3284.164869232" observedRunningTime="2026-02-19 09:39:43.17637763 +0000 UTC m=+3285.164389102" 
watchObservedRunningTime="2026-02-19 09:39:43.189680293 +0000 UTC m=+3285.177691805" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.574364 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sb7nx/crc-debug-pzf6g"] Feb 19 09:39:46 crc kubenswrapper[4788]: E0219 09:39:46.575315 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" containerName="extract-utilities" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.575334 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" containerName="extract-utilities" Feb 19 09:39:46 crc kubenswrapper[4788]: E0219 09:39:46.575348 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" containerName="registry-server" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.575355 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" containerName="registry-server" Feb 19 09:39:46 crc kubenswrapper[4788]: E0219 09:39:46.575372 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" containerName="extract-content" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.575380 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" containerName="extract-content" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.575612 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca2aa2f1-d0e3-4d88-a360-0a94a5516b15" containerName="registry-server" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.576389 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.578566 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-sb7nx"/"default-dockercfg-cgwb8" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.704500 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnprg\" (UniqueName: \"kubernetes.io/projected/096c13b7-1831-4f1e-9a9d-88acd8a797a7-kube-api-access-fnprg\") pod \"crc-debug-pzf6g\" (UID: \"096c13b7-1831-4f1e-9a9d-88acd8a797a7\") " pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.704863 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/096c13b7-1831-4f1e-9a9d-88acd8a797a7-host\") pod \"crc-debug-pzf6g\" (UID: \"096c13b7-1831-4f1e-9a9d-88acd8a797a7\") " pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.807980 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnprg\" (UniqueName: \"kubernetes.io/projected/096c13b7-1831-4f1e-9a9d-88acd8a797a7-kube-api-access-fnprg\") pod \"crc-debug-pzf6g\" (UID: \"096c13b7-1831-4f1e-9a9d-88acd8a797a7\") " pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.808056 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/096c13b7-1831-4f1e-9a9d-88acd8a797a7-host\") pod \"crc-debug-pzf6g\" (UID: \"096c13b7-1831-4f1e-9a9d-88acd8a797a7\") " pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.808397 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/096c13b7-1831-4f1e-9a9d-88acd8a797a7-host\") pod \"crc-debug-pzf6g\" (UID: \"096c13b7-1831-4f1e-9a9d-88acd8a797a7\") " pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.836317 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnprg\" (UniqueName: \"kubernetes.io/projected/096c13b7-1831-4f1e-9a9d-88acd8a797a7-kube-api-access-fnprg\") pod \"crc-debug-pzf6g\" (UID: \"096c13b7-1831-4f1e-9a9d-88acd8a797a7\") " pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" Feb 19 09:39:46 crc kubenswrapper[4788]: I0219 09:39:46.893235 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" Feb 19 09:39:47 crc kubenswrapper[4788]: I0219 09:39:47.203894 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" event={"ID":"096c13b7-1831-4f1e-9a9d-88acd8a797a7","Type":"ContainerStarted","Data":"ec80fd6eefb780d44963602d72095317072b0601f81e84bbbe3410b007ac36c5"} Feb 19 09:39:59 crc kubenswrapper[4788]: I0219 09:39:59.339190 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" event={"ID":"096c13b7-1831-4f1e-9a9d-88acd8a797a7","Type":"ContainerStarted","Data":"794a76811f24c00792eb8523a435428af3528f1ed149f7ea34cef3d004af60bb"} Feb 19 09:39:59 crc kubenswrapper[4788]: I0219 09:39:59.352510 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" podStartSLOduration=1.3757061689999999 podStartE2EDuration="13.352495224s" podCreationTimestamp="2026-02-19 09:39:46 +0000 UTC" firstStartedPulling="2026-02-19 09:39:46.936300119 +0000 UTC m=+3288.924311591" lastFinishedPulling="2026-02-19 09:39:58.913089174 +0000 UTC m=+3300.901100646" observedRunningTime="2026-02-19 09:39:59.349381693 +0000 UTC m=+3301.337393165" 
watchObservedRunningTime="2026-02-19 09:39:59.352495224 +0000 UTC m=+3301.340506696" Feb 19 09:40:42 crc kubenswrapper[4788]: I0219 09:40:42.744451 4788 generic.go:334] "Generic (PLEG): container finished" podID="096c13b7-1831-4f1e-9a9d-88acd8a797a7" containerID="794a76811f24c00792eb8523a435428af3528f1ed149f7ea34cef3d004af60bb" exitCode=0 Feb 19 09:40:42 crc kubenswrapper[4788]: I0219 09:40:42.744504 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" event={"ID":"096c13b7-1831-4f1e-9a9d-88acd8a797a7","Type":"ContainerDied","Data":"794a76811f24c00792eb8523a435428af3528f1ed149f7ea34cef3d004af60bb"} Feb 19 09:40:43 crc kubenswrapper[4788]: I0219 09:40:43.866560 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" Feb 19 09:40:43 crc kubenswrapper[4788]: I0219 09:40:43.927461 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/096c13b7-1831-4f1e-9a9d-88acd8a797a7-host\") pod \"096c13b7-1831-4f1e-9a9d-88acd8a797a7\" (UID: \"096c13b7-1831-4f1e-9a9d-88acd8a797a7\") " Feb 19 09:40:43 crc kubenswrapper[4788]: I0219 09:40:43.927528 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnprg\" (UniqueName: \"kubernetes.io/projected/096c13b7-1831-4f1e-9a9d-88acd8a797a7-kube-api-access-fnprg\") pod \"096c13b7-1831-4f1e-9a9d-88acd8a797a7\" (UID: \"096c13b7-1831-4f1e-9a9d-88acd8a797a7\") " Feb 19 09:40:43 crc kubenswrapper[4788]: I0219 09:40:43.927621 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/096c13b7-1831-4f1e-9a9d-88acd8a797a7-host" (OuterVolumeSpecName: "host") pod "096c13b7-1831-4f1e-9a9d-88acd8a797a7" (UID: "096c13b7-1831-4f1e-9a9d-88acd8a797a7"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:40:43 crc kubenswrapper[4788]: I0219 09:40:43.928047 4788 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/096c13b7-1831-4f1e-9a9d-88acd8a797a7-host\") on node \"crc\" DevicePath \"\"" Feb 19 09:40:43 crc kubenswrapper[4788]: I0219 09:40:43.929644 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sb7nx/crc-debug-pzf6g"] Feb 19 09:40:43 crc kubenswrapper[4788]: I0219 09:40:43.935960 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/096c13b7-1831-4f1e-9a9d-88acd8a797a7-kube-api-access-fnprg" (OuterVolumeSpecName: "kube-api-access-fnprg") pod "096c13b7-1831-4f1e-9a9d-88acd8a797a7" (UID: "096c13b7-1831-4f1e-9a9d-88acd8a797a7"). InnerVolumeSpecName "kube-api-access-fnprg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:40:43 crc kubenswrapper[4788]: I0219 09:40:43.937669 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sb7nx/crc-debug-pzf6g"] Feb 19 09:40:44 crc kubenswrapper[4788]: I0219 09:40:44.030729 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnprg\" (UniqueName: \"kubernetes.io/projected/096c13b7-1831-4f1e-9a9d-88acd8a797a7-kube-api-access-fnprg\") on node \"crc\" DevicePath \"\"" Feb 19 09:40:44 crc kubenswrapper[4788]: I0219 09:40:44.735042 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="096c13b7-1831-4f1e-9a9d-88acd8a797a7" path="/var/lib/kubelet/pods/096c13b7-1831-4f1e-9a9d-88acd8a797a7/volumes" Feb 19 09:40:44 crc kubenswrapper[4788]: I0219 09:40:44.769475 4788 scope.go:117] "RemoveContainer" containerID="794a76811f24c00792eb8523a435428af3528f1ed149f7ea34cef3d004af60bb" Feb 19 09:40:44 crc kubenswrapper[4788]: I0219 09:40:44.769550 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-pzf6g" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.135382 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sb7nx/crc-debug-nvszt"] Feb 19 09:40:45 crc kubenswrapper[4788]: E0219 09:40:45.135840 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="096c13b7-1831-4f1e-9a9d-88acd8a797a7" containerName="container-00" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.135856 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="096c13b7-1831-4f1e-9a9d-88acd8a797a7" containerName="container-00" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.136195 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="096c13b7-1831-4f1e-9a9d-88acd8a797a7" containerName="container-00" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.136915 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-nvszt" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.142825 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-sb7nx"/"default-dockercfg-cgwb8" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.154322 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71ebadeb-2d97-43b0-9abd-d86952d12f6b-host\") pod \"crc-debug-nvszt\" (UID: \"71ebadeb-2d97-43b0-9abd-d86952d12f6b\") " pod="openshift-must-gather-sb7nx/crc-debug-nvszt" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.154414 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62x2v\" (UniqueName: \"kubernetes.io/projected/71ebadeb-2d97-43b0-9abd-d86952d12f6b-kube-api-access-62x2v\") pod \"crc-debug-nvszt\" (UID: \"71ebadeb-2d97-43b0-9abd-d86952d12f6b\") " 
pod="openshift-must-gather-sb7nx/crc-debug-nvszt" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.257005 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71ebadeb-2d97-43b0-9abd-d86952d12f6b-host\") pod \"crc-debug-nvszt\" (UID: \"71ebadeb-2d97-43b0-9abd-d86952d12f6b\") " pod="openshift-must-gather-sb7nx/crc-debug-nvszt" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.257148 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62x2v\" (UniqueName: \"kubernetes.io/projected/71ebadeb-2d97-43b0-9abd-d86952d12f6b-kube-api-access-62x2v\") pod \"crc-debug-nvszt\" (UID: \"71ebadeb-2d97-43b0-9abd-d86952d12f6b\") " pod="openshift-must-gather-sb7nx/crc-debug-nvszt" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.257222 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71ebadeb-2d97-43b0-9abd-d86952d12f6b-host\") pod \"crc-debug-nvszt\" (UID: \"71ebadeb-2d97-43b0-9abd-d86952d12f6b\") " pod="openshift-must-gather-sb7nx/crc-debug-nvszt" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.278693 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62x2v\" (UniqueName: \"kubernetes.io/projected/71ebadeb-2d97-43b0-9abd-d86952d12f6b-kube-api-access-62x2v\") pod \"crc-debug-nvszt\" (UID: \"71ebadeb-2d97-43b0-9abd-d86952d12f6b\") " pod="openshift-must-gather-sb7nx/crc-debug-nvszt" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.459613 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-nvszt" Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.785908 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/crc-debug-nvszt" event={"ID":"71ebadeb-2d97-43b0-9abd-d86952d12f6b","Type":"ContainerStarted","Data":"dcde4bbf095dd84d4d54788aa130518108921d8066b2338e8d3fe7421719e0c5"} Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.786200 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/crc-debug-nvszt" event={"ID":"71ebadeb-2d97-43b0-9abd-d86952d12f6b","Type":"ContainerStarted","Data":"a799966991c0b4c29ee4b466000e4214452885c4bd95b9985c9663688ce304c4"} Feb 19 09:40:45 crc kubenswrapper[4788]: I0219 09:40:45.803901 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sb7nx/crc-debug-nvszt" podStartSLOduration=0.803876983 podStartE2EDuration="803.876983ms" podCreationTimestamp="2026-02-19 09:40:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:40:45.797913865 +0000 UTC m=+3347.785925357" watchObservedRunningTime="2026-02-19 09:40:45.803876983 +0000 UTC m=+3347.791888465" Feb 19 09:40:46 crc kubenswrapper[4788]: I0219 09:40:46.820261 4788 generic.go:334] "Generic (PLEG): container finished" podID="71ebadeb-2d97-43b0-9abd-d86952d12f6b" containerID="dcde4bbf095dd84d4d54788aa130518108921d8066b2338e8d3fe7421719e0c5" exitCode=0 Feb 19 09:40:46 crc kubenswrapper[4788]: I0219 09:40:46.820352 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/crc-debug-nvszt" event={"ID":"71ebadeb-2d97-43b0-9abd-d86952d12f6b","Type":"ContainerDied","Data":"dcde4bbf095dd84d4d54788aa130518108921d8066b2338e8d3fe7421719e0c5"} Feb 19 09:40:47 crc kubenswrapper[4788]: I0219 09:40:47.933105 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-nvszt" Feb 19 09:40:47 crc kubenswrapper[4788]: I0219 09:40:47.992496 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sb7nx/crc-debug-nvszt"] Feb 19 09:40:48 crc kubenswrapper[4788]: I0219 09:40:48.001511 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sb7nx/crc-debug-nvszt"] Feb 19 09:40:48 crc kubenswrapper[4788]: I0219 09:40:48.006963 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62x2v\" (UniqueName: \"kubernetes.io/projected/71ebadeb-2d97-43b0-9abd-d86952d12f6b-kube-api-access-62x2v\") pod \"71ebadeb-2d97-43b0-9abd-d86952d12f6b\" (UID: \"71ebadeb-2d97-43b0-9abd-d86952d12f6b\") " Feb 19 09:40:48 crc kubenswrapper[4788]: I0219 09:40:48.007106 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71ebadeb-2d97-43b0-9abd-d86952d12f6b-host\") pod \"71ebadeb-2d97-43b0-9abd-d86952d12f6b\" (UID: \"71ebadeb-2d97-43b0-9abd-d86952d12f6b\") " Feb 19 09:40:48 crc kubenswrapper[4788]: I0219 09:40:48.007210 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71ebadeb-2d97-43b0-9abd-d86952d12f6b-host" (OuterVolumeSpecName: "host") pod "71ebadeb-2d97-43b0-9abd-d86952d12f6b" (UID: "71ebadeb-2d97-43b0-9abd-d86952d12f6b"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:40:48 crc kubenswrapper[4788]: I0219 09:40:48.007702 4788 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/71ebadeb-2d97-43b0-9abd-d86952d12f6b-host\") on node \"crc\" DevicePath \"\"" Feb 19 09:40:48 crc kubenswrapper[4788]: I0219 09:40:48.016576 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ebadeb-2d97-43b0-9abd-d86952d12f6b-kube-api-access-62x2v" (OuterVolumeSpecName: "kube-api-access-62x2v") pod "71ebadeb-2d97-43b0-9abd-d86952d12f6b" (UID: "71ebadeb-2d97-43b0-9abd-d86952d12f6b"). InnerVolumeSpecName "kube-api-access-62x2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:40:48 crc kubenswrapper[4788]: I0219 09:40:48.113667 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62x2v\" (UniqueName: \"kubernetes.io/projected/71ebadeb-2d97-43b0-9abd-d86952d12f6b-kube-api-access-62x2v\") on node \"crc\" DevicePath \"\"" Feb 19 09:40:48 crc kubenswrapper[4788]: I0219 09:40:48.748735 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ebadeb-2d97-43b0-9abd-d86952d12f6b" path="/var/lib/kubelet/pods/71ebadeb-2d97-43b0-9abd-d86952d12f6b/volumes" Feb 19 09:40:48 crc kubenswrapper[4788]: I0219 09:40:48.839076 4788 scope.go:117] "RemoveContainer" containerID="dcde4bbf095dd84d4d54788aa130518108921d8066b2338e8d3fe7421719e0c5" Feb 19 09:40:48 crc kubenswrapper[4788]: I0219 09:40:48.839154 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-nvszt" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.232849 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sb7nx/crc-debug-hm4k8"] Feb 19 09:40:49 crc kubenswrapper[4788]: E0219 09:40:49.234120 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ebadeb-2d97-43b0-9abd-d86952d12f6b" containerName="container-00" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.234154 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ebadeb-2d97-43b0-9abd-d86952d12f6b" containerName="container-00" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.236355 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ebadeb-2d97-43b0-9abd-d86952d12f6b" containerName="container-00" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.237131 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.238895 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-sb7nx"/"default-dockercfg-cgwb8" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.333921 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6749c176-feca-447e-bab3-2c754139d689-host\") pod \"crc-debug-hm4k8\" (UID: \"6749c176-feca-447e-bab3-2c754139d689\") " pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.334003 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcnc4\" (UniqueName: \"kubernetes.io/projected/6749c176-feca-447e-bab3-2c754139d689-kube-api-access-zcnc4\") pod \"crc-debug-hm4k8\" (UID: \"6749c176-feca-447e-bab3-2c754139d689\") " 
pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.436072 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6749c176-feca-447e-bab3-2c754139d689-host\") pod \"crc-debug-hm4k8\" (UID: \"6749c176-feca-447e-bab3-2c754139d689\") " pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.436314 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6749c176-feca-447e-bab3-2c754139d689-host\") pod \"crc-debug-hm4k8\" (UID: \"6749c176-feca-447e-bab3-2c754139d689\") " pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.436655 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcnc4\" (UniqueName: \"kubernetes.io/projected/6749c176-feca-447e-bab3-2c754139d689-kube-api-access-zcnc4\") pod \"crc-debug-hm4k8\" (UID: \"6749c176-feca-447e-bab3-2c754139d689\") " pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.471181 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcnc4\" (UniqueName: \"kubernetes.io/projected/6749c176-feca-447e-bab3-2c754139d689-kube-api-access-zcnc4\") pod \"crc-debug-hm4k8\" (UID: \"6749c176-feca-447e-bab3-2c754139d689\") " pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.564817 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" Feb 19 09:40:49 crc kubenswrapper[4788]: W0219 09:40:49.613678 4788 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6749c176_feca_447e_bab3_2c754139d689.slice/crio-71bc61bbca0c1a8dc8c262bf845268409b2a1fa675bc99fc9264458b71752421 WatchSource:0}: Error finding container 71bc61bbca0c1a8dc8c262bf845268409b2a1fa675bc99fc9264458b71752421: Status 404 returned error can't find the container with id 71bc61bbca0c1a8dc8c262bf845268409b2a1fa675bc99fc9264458b71752421 Feb 19 09:40:49 crc kubenswrapper[4788]: I0219 09:40:49.852610 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" event={"ID":"6749c176-feca-447e-bab3-2c754139d689","Type":"ContainerStarted","Data":"71bc61bbca0c1a8dc8c262bf845268409b2a1fa675bc99fc9264458b71752421"} Feb 19 09:40:50 crc kubenswrapper[4788]: I0219 09:40:50.867942 4788 generic.go:334] "Generic (PLEG): container finished" podID="6749c176-feca-447e-bab3-2c754139d689" containerID="7383004fac591c28f409f8103a55ff7c2df1b26afcde356d32b15ca290f987b8" exitCode=0 Feb 19 09:40:50 crc kubenswrapper[4788]: I0219 09:40:50.868016 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" event={"ID":"6749c176-feca-447e-bab3-2c754139d689","Type":"ContainerDied","Data":"7383004fac591c28f409f8103a55ff7c2df1b26afcde356d32b15ca290f987b8"} Feb 19 09:40:50 crc kubenswrapper[4788]: I0219 09:40:50.904352 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sb7nx/crc-debug-hm4k8"] Feb 19 09:40:50 crc kubenswrapper[4788]: I0219 09:40:50.912804 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sb7nx/crc-debug-hm4k8"] Feb 19 09:40:51 crc kubenswrapper[4788]: I0219 09:40:51.961686 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" Feb 19 09:40:51 crc kubenswrapper[4788]: I0219 09:40:51.985520 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6749c176-feca-447e-bab3-2c754139d689-host\") pod \"6749c176-feca-447e-bab3-2c754139d689\" (UID: \"6749c176-feca-447e-bab3-2c754139d689\") " Feb 19 09:40:51 crc kubenswrapper[4788]: I0219 09:40:51.985635 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6749c176-feca-447e-bab3-2c754139d689-host" (OuterVolumeSpecName: "host") pod "6749c176-feca-447e-bab3-2c754139d689" (UID: "6749c176-feca-447e-bab3-2c754139d689"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 19 09:40:51 crc kubenswrapper[4788]: I0219 09:40:51.985695 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcnc4\" (UniqueName: \"kubernetes.io/projected/6749c176-feca-447e-bab3-2c754139d689-kube-api-access-zcnc4\") pod \"6749c176-feca-447e-bab3-2c754139d689\" (UID: \"6749c176-feca-447e-bab3-2c754139d689\") " Feb 19 09:40:51 crc kubenswrapper[4788]: I0219 09:40:51.986156 4788 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6749c176-feca-447e-bab3-2c754139d689-host\") on node \"crc\" DevicePath \"\"" Feb 19 09:40:52 crc kubenswrapper[4788]: I0219 09:40:52.000438 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6749c176-feca-447e-bab3-2c754139d689-kube-api-access-zcnc4" (OuterVolumeSpecName: "kube-api-access-zcnc4") pod "6749c176-feca-447e-bab3-2c754139d689" (UID: "6749c176-feca-447e-bab3-2c754139d689"). InnerVolumeSpecName "kube-api-access-zcnc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:40:52 crc kubenswrapper[4788]: I0219 09:40:52.087601 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcnc4\" (UniqueName: \"kubernetes.io/projected/6749c176-feca-447e-bab3-2c754139d689-kube-api-access-zcnc4\") on node \"crc\" DevicePath \"\"" Feb 19 09:40:52 crc kubenswrapper[4788]: I0219 09:40:52.138774 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:40:52 crc kubenswrapper[4788]: I0219 09:40:52.138834 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:40:52 crc kubenswrapper[4788]: I0219 09:40:52.725948 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6749c176-feca-447e-bab3-2c754139d689" path="/var/lib/kubelet/pods/6749c176-feca-447e-bab3-2c754139d689/volumes" Feb 19 09:40:52 crc kubenswrapper[4788]: I0219 09:40:52.884428 4788 scope.go:117] "RemoveContainer" containerID="7383004fac591c28f409f8103a55ff7c2df1b26afcde356d32b15ca290f987b8" Feb 19 09:40:52 crc kubenswrapper[4788]: I0219 09:40:52.884466 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sb7nx/crc-debug-hm4k8" Feb 19 09:41:08 crc kubenswrapper[4788]: I0219 09:41:08.306191 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56db6bc974-wjvtx_817ea3c2-ff5a-475b-a6cd-295c84c9d02c/barbican-api/0.log" Feb 19 09:41:08 crc kubenswrapper[4788]: I0219 09:41:08.472866 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56db6bc974-wjvtx_817ea3c2-ff5a-475b-a6cd-295c84c9d02c/barbican-api-log/0.log" Feb 19 09:41:08 crc kubenswrapper[4788]: I0219 09:41:08.788427 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78686fb9d-c7vd2_0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5/barbican-keystone-listener/0.log" Feb 19 09:41:08 crc kubenswrapper[4788]: I0219 09:41:08.922042 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5565784c67-nzhww_c0a88dc2-18b9-4d55-9930-0c3396063e8b/barbican-worker/0.log" Feb 19 09:41:08 crc kubenswrapper[4788]: I0219 09:41:08.923597 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78686fb9d-c7vd2_0b5ecaf8-40b9-4c0b-a300-7db14ccf2be5/barbican-keystone-listener-log/0.log" Feb 19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.008642 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5565784c67-nzhww_c0a88dc2-18b9-4d55-9930-0c3396063e8b/barbican-worker-log/0.log" Feb 19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.147284 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-pnx49_ac977ac7-d7dd-4af4-a079-dbcadde95e32/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.248483 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c273f46-7fd9-4269-98a8-1df269a9a915/ceilometer-central-agent/0.log" Feb 
19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.311309 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c273f46-7fd9-4269-98a8-1df269a9a915/ceilometer-notification-agent/0.log" Feb 19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.379387 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c273f46-7fd9-4269-98a8-1df269a9a915/proxy-httpd/0.log" Feb 19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.427791 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c273f46-7fd9-4269-98a8-1df269a9a915/sg-core/0.log" Feb 19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.546352 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e4897501-a017-443d-ac1c-08a9e23629b5/cinder-api/0.log" Feb 19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.610529 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e4897501-a017-443d-ac1c-08a9e23629b5/cinder-api-log/0.log" Feb 19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.720807 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fe790dae-c901-4447-9716-8b3e366c08a0/cinder-scheduler/0.log" Feb 19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.756384 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fe790dae-c901-4447-9716-8b3e366c08a0/probe/0.log" Feb 19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.941610 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hlsk5_e16c6a7e-f78e-45bd-9b1f-60d61f04e91f/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:09 crc kubenswrapper[4788]: I0219 09:41:09.987421 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-rl87p_fd3576f8-7d3f-464b-9bcd-b07fa37ef51e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:10 crc kubenswrapper[4788]: I0219 09:41:10.125296 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-67csm_4830cf68-1ea9-4b7f-899b-5a5935bc2230/init/0.log" Feb 19 09:41:10 crc kubenswrapper[4788]: I0219 09:41:10.378355 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-67csm_4830cf68-1ea9-4b7f-899b-5a5935bc2230/init/0.log" Feb 19 09:41:10 crc kubenswrapper[4788]: I0219 09:41:10.419515 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-67csm_4830cf68-1ea9-4b7f-899b-5a5935bc2230/dnsmasq-dns/0.log" Feb 19 09:41:10 crc kubenswrapper[4788]: I0219 09:41:10.447717 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-5w5mk_952e8fd1-9634-4af1-8f56-62068214b66c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:10 crc kubenswrapper[4788]: I0219 09:41:10.611400 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_850e55a0-6179-423a-8698-ae1f87b8c049/glance-httpd/0.log" Feb 19 09:41:10 crc kubenswrapper[4788]: I0219 09:41:10.669868 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_850e55a0-6179-423a-8698-ae1f87b8c049/glance-log/0.log" Feb 19 09:41:10 crc kubenswrapper[4788]: I0219 09:41:10.841065 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b4abac97-c381-46dc-8451-35c8db80c9bd/glance-log/0.log" Feb 19 09:41:10 crc kubenswrapper[4788]: I0219 09:41:10.842523 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_b4abac97-c381-46dc-8451-35c8db80c9bd/glance-httpd/0.log" Feb 19 09:41:11 crc kubenswrapper[4788]: I0219 09:41:11.492484 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6f874b587-dc7jr_d32302e3-6d30-4f9e-b993-e5fbaae1b9eb/heat-cfnapi/0.log" Feb 19 09:41:11 crc kubenswrapper[4788]: I0219 09:41:11.521436 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-69f6799bd7-ht4q2_02915100-34f3-4f6d-945c-1417a4bd06f7/heat-engine/0.log" Feb 19 09:41:11 crc kubenswrapper[4788]: I0219 09:41:11.540045 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-746b766c9d-9j8dk_b7b1972f-d2de-4154-a5fd-1b0adb9952a8/heat-api/0.log" Feb 19 09:41:11 crc kubenswrapper[4788]: I0219 09:41:11.692933 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-2vvms_cbc4fcdf-b1bf-4419-9b5d-77e3f831e02e/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:11 crc kubenswrapper[4788]: I0219 09:41:11.737427 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-chr7d_ffef0c7e-2933-4173-ac71-b61fa297cad9/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:12 crc kubenswrapper[4788]: I0219 09:41:12.094329 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_10e25de6-536d-4640-b29a-702c7d4ca706/kube-state-metrics/0.log" Feb 19 09:41:12 crc kubenswrapper[4788]: I0219 09:41:12.182699 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-748f6c7c59-q59qb_fa3fa772-2fba-4d00-993a-f240d053d0a9/keystone-api/0.log" Feb 19 09:41:12 crc kubenswrapper[4788]: I0219 09:41:12.237272 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-52lpp_b6e56cc4-a944-431d-988d-a29bb84b7d04/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:12 crc kubenswrapper[4788]: I0219 09:41:12.515872 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-647d45fc9-99jnj_27632492-f51f-49c6-a63a-d037329d57e9/neutron-api/0.log" Feb 19 09:41:12 crc kubenswrapper[4788]: I0219 09:41:12.570400 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-647d45fc9-99jnj_27632492-f51f-49c6-a63a-d037329d57e9/neutron-httpd/0.log" Feb 19 09:41:12 crc kubenswrapper[4788]: I0219 09:41:12.679458 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-brcmk_b5eefb80-777b-4277-88f1-ac900e3d1b1f/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:13 crc kubenswrapper[4788]: I0219 09:41:13.094905 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4c696528-c586-4ecc-8788-df43d6d03193/nova-cell0-conductor-conductor/0.log" Feb 19 09:41:13 crc kubenswrapper[4788]: I0219 09:41:13.187224 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2530b11c-c0ce-4ab3-9a0b-70060eb85184/nova-api-log/0.log" Feb 19 09:41:13 crc kubenswrapper[4788]: I0219 09:41:13.297874 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2530b11c-c0ce-4ab3-9a0b-70060eb85184/nova-api-api/0.log" Feb 19 09:41:13 crc kubenswrapper[4788]: I0219 09:41:13.329547 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e49d31ab-3c5e-4389-8cf4-798995b5880f/nova-cell1-conductor-conductor/0.log" Feb 19 09:41:13 crc kubenswrapper[4788]: I0219 09:41:13.466875 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f812a94e-1c79-42f5-8caa-34cd8352999c/nova-cell1-novncproxy-novncproxy/0.log" Feb 19 09:41:13 crc kubenswrapper[4788]: I0219 09:41:13.622528 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-l5jz4_015b4a74-0341-4e84-862c-d627e79f1318/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:13 crc kubenswrapper[4788]: I0219 09:41:13.753130 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fb5bf2a2-d945-4fab-a232-ee95c75d94d0/nova-metadata-log/0.log" Feb 19 09:41:14 crc kubenswrapper[4788]: I0219 09:41:14.034499 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_776314c0-8a5e-4224-8337-d2ae060a7ecd/mysql-bootstrap/0.log" Feb 19 09:41:14 crc kubenswrapper[4788]: I0219 09:41:14.035768 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a6034eb4-975c-485a-b636-25fa666dd148/nova-scheduler-scheduler/0.log" Feb 19 09:41:14 crc kubenswrapper[4788]: I0219 09:41:14.247791 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_776314c0-8a5e-4224-8337-d2ae060a7ecd/mysql-bootstrap/0.log" Feb 19 09:41:14 crc kubenswrapper[4788]: I0219 09:41:14.253919 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_776314c0-8a5e-4224-8337-d2ae060a7ecd/galera/0.log" Feb 19 09:41:14 crc kubenswrapper[4788]: I0219 09:41:14.418684 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e1e578ea-f70e-4aed-910b-4c1c0ddb3c39/mysql-bootstrap/0.log" Feb 19 09:41:14 crc kubenswrapper[4788]: I0219 09:41:14.635133 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e1e578ea-f70e-4aed-910b-4c1c0ddb3c39/mysql-bootstrap/0.log" Feb 19 09:41:14 crc kubenswrapper[4788]: I0219 09:41:14.650564 4788 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e1e578ea-f70e-4aed-910b-4c1c0ddb3c39/galera/0.log" Feb 19 09:41:14 crc kubenswrapper[4788]: I0219 09:41:14.833822 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_dd204842-7289-418d-a4d1-e0d079e368b3/openstackclient/0.log" Feb 19 09:41:14 crc kubenswrapper[4788]: I0219 09:41:14.849095 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_fb5bf2a2-d945-4fab-a232-ee95c75d94d0/nova-metadata-metadata/0.log" Feb 19 09:41:14 crc kubenswrapper[4788]: I0219 09:41:14.890149 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-vdj96_c8a9b729-65f6-40d1-94d6-149133edae05/openstack-network-exporter/0.log" Feb 19 09:41:15 crc kubenswrapper[4788]: I0219 09:41:15.084629 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-msb2b_be836fd0-7c7e-4824-b455-bb4ccec1163e/ovn-controller/0.log" Feb 19 09:41:15 crc kubenswrapper[4788]: I0219 09:41:15.171818 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-snwhx_f7e5355c-1a77-48de-998c-7d6e676d5eee/ovsdb-server-init/0.log" Feb 19 09:41:15 crc kubenswrapper[4788]: I0219 09:41:15.286003 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-snwhx_f7e5355c-1a77-48de-998c-7d6e676d5eee/ovsdb-server-init/0.log" Feb 19 09:41:15 crc kubenswrapper[4788]: I0219 09:41:15.291919 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-snwhx_f7e5355c-1a77-48de-998c-7d6e676d5eee/ovs-vswitchd/0.log" Feb 19 09:41:15 crc kubenswrapper[4788]: I0219 09:41:15.330430 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-snwhx_f7e5355c-1a77-48de-998c-7d6e676d5eee/ovsdb-server/0.log" Feb 19 09:41:15 crc kubenswrapper[4788]: I0219 09:41:15.487018 4788 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-hj98t_0db811eb-116f-4653-94a4-467209ef8e49/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:15 crc kubenswrapper[4788]: I0219 09:41:15.546173 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c371e63d-7b67-4bcf-a08d-292f3743c388/openstack-network-exporter/0.log" Feb 19 09:41:15 crc kubenswrapper[4788]: I0219 09:41:15.609979 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c371e63d-7b67-4bcf-a08d-292f3743c388/ovn-northd/0.log" Feb 19 09:41:15 crc kubenswrapper[4788]: I0219 09:41:15.857940 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8a6d4e8-cfb3-4949-9901-ca31478fc108/ovsdbserver-nb/0.log" Feb 19 09:41:15 crc kubenswrapper[4788]: I0219 09:41:15.926592 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8a6d4e8-cfb3-4949-9901-ca31478fc108/openstack-network-exporter/0.log" Feb 19 09:41:16 crc kubenswrapper[4788]: I0219 09:41:16.129506 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ad13ca9c-744c-4c12-9911-7e84100a1bda/ovsdbserver-sb/0.log" Feb 19 09:41:16 crc kubenswrapper[4788]: I0219 09:41:16.149790 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ad13ca9c-744c-4c12-9911-7e84100a1bda/openstack-network-exporter/0.log" Feb 19 09:41:16 crc kubenswrapper[4788]: I0219 09:41:16.302298 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-86464574f6-lv4mn_9b9de162-c3e4-4ab7-a29d-4dabf60de673/placement-api/0.log" Feb 19 09:41:16 crc kubenswrapper[4788]: I0219 09:41:16.385095 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_01ace23c-c0e0-4390-85fc-1b50f8d72a66/setup-container/0.log" Feb 19 09:41:16 crc kubenswrapper[4788]: I0219 09:41:16.420073 
4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-86464574f6-lv4mn_9b9de162-c3e4-4ab7-a29d-4dabf60de673/placement-log/0.log" Feb 19 09:41:16 crc kubenswrapper[4788]: I0219 09:41:16.583681 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_01ace23c-c0e0-4390-85fc-1b50f8d72a66/rabbitmq/0.log" Feb 19 09:41:16 crc kubenswrapper[4788]: I0219 09:41:16.612538 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_01ace23c-c0e0-4390-85fc-1b50f8d72a66/setup-container/0.log" Feb 19 09:41:16 crc kubenswrapper[4788]: I0219 09:41:16.679108 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_77af71c0-581a-4e58-9429-bb14901b1a1d/setup-container/0.log" Feb 19 09:41:16 crc kubenswrapper[4788]: I0219 09:41:16.809176 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_77af71c0-581a-4e58-9429-bb14901b1a1d/setup-container/0.log" Feb 19 09:41:16 crc kubenswrapper[4788]: I0219 09:41:16.862683 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_77af71c0-581a-4e58-9429-bb14901b1a1d/rabbitmq/0.log" Feb 19 09:41:16 crc kubenswrapper[4788]: I0219 09:41:16.890235 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-wwlfg_a925d651-e8e3-4436-b6b8-4894a550431f/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:17 crc kubenswrapper[4788]: I0219 09:41:17.148514 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-sqp6l_e9459589-7001-4a8f-ac0c-6dae0ce143bf/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:17 crc kubenswrapper[4788]: I0219 09:41:17.151022 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-lhgwn_4a6886e3-bd0f-4551-ac66-421e052315f1/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:17 crc kubenswrapper[4788]: I0219 09:41:17.330195 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-fbj5p_5900ffba-e746-46d2-bb1a-07ea800e9ff5/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:17 crc kubenswrapper[4788]: I0219 09:41:17.410840 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cx4ht_2317a010-4151-4577-804c-70f4e7fb2775/ssh-known-hosts-edpm-deployment/0.log" Feb 19 09:41:17 crc kubenswrapper[4788]: I0219 09:41:17.680857 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-58576d599f-zjnz8_6afa4604-515a-4774-9d5f-e641cb256988/proxy-server/0.log" Feb 19 09:41:17 crc kubenswrapper[4788]: I0219 09:41:17.741573 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-58576d599f-zjnz8_6afa4604-515a-4774-9d5f-e641cb256988/proxy-httpd/0.log" Feb 19 09:41:17 crc kubenswrapper[4788]: I0219 09:41:17.836156 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-tc7kz_7f8ec3fb-0caa-413e-b4e5-74bff2be5e4a/swift-ring-rebalance/0.log" Feb 19 09:41:17 crc kubenswrapper[4788]: I0219 09:41:17.970887 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/account-auditor/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.004076 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/account-reaper/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.073008 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/account-replicator/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.177951 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/container-auditor/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.242289 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/container-replicator/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.271756 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/account-server/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.315489 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/container-server/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.391662 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/container-updater/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.460273 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/object-auditor/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.489314 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/object-expirer/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.585913 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/object-replicator/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.592261 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/object-server/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.651551 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/object-updater/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.707442 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/rsync/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.840611 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_f91d0357-4651-41b6-a842-27d8c7f47e60/swift-recon-cron/0.log" Feb 19 09:41:18 crc kubenswrapper[4788]: I0219 09:41:18.950548 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xgkcr_030858ba-0181-4da7-afa6-ec5fb6cefc0f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:19 crc kubenswrapper[4788]: I0219 09:41:19.104443 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_59d3eda3-8975-46b8-8cfa-27b4dcd210f7/tempest-tests-tempest-tests-runner/0.log" Feb 19 09:41:19 crc kubenswrapper[4788]: I0219 09:41:19.286926 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_03eee958-9fa0-4e8d-8f47-a40b7fab0b78/test-operator-logs-container/0.log" Feb 19 09:41:19 crc kubenswrapper[4788]: I0219 09:41:19.461592 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-dkbnx_c285f791-b6be-44ce-8534-6196c12656ad/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 19 09:41:22 crc kubenswrapper[4788]: I0219 09:41:22.139027 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:41:22 crc kubenswrapper[4788]: I0219 09:41:22.139280 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:41:26 crc kubenswrapper[4788]: I0219 09:41:26.468946 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_74df8b3a-d0c6-4cb3-b514-a38198179c59/memcached/0.log" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.507877 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r58w6"] Feb 19 09:41:29 crc kubenswrapper[4788]: E0219 09:41:29.508600 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6749c176-feca-447e-bab3-2c754139d689" containerName="container-00" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.508612 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="6749c176-feca-447e-bab3-2c754139d689" containerName="container-00" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.508818 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="6749c176-feca-447e-bab3-2c754139d689" containerName="container-00" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.510033 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.518476 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r58w6"] Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.642202 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-catalog-content\") pod \"community-operators-r58w6\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.642279 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-utilities\") pod \"community-operators-r58w6\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.642311 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q279q\" (UniqueName: \"kubernetes.io/projected/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-kube-api-access-q279q\") pod \"community-operators-r58w6\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.745159 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-catalog-content\") pod \"community-operators-r58w6\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.745234 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-utilities\") pod \"community-operators-r58w6\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.745326 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q279q\" (UniqueName: \"kubernetes.io/projected/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-kube-api-access-q279q\") pod \"community-operators-r58w6\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.746171 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-catalog-content\") pod \"community-operators-r58w6\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.746313 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-utilities\") pod \"community-operators-r58w6\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.772496 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q279q\" (UniqueName: \"kubernetes.io/projected/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-kube-api-access-q279q\") pod \"community-operators-r58w6\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:29 crc kubenswrapper[4788]: I0219 09:41:29.861123 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:30 crc kubenswrapper[4788]: I0219 09:41:30.388541 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r58w6"] Feb 19 09:41:31 crc kubenswrapper[4788]: I0219 09:41:31.193508 4788 generic.go:334] "Generic (PLEG): container finished" podID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" containerID="43a576d6184e2a2f8c63af27a1bfd05af2397445d7b47086f709e73b61332167" exitCode=0 Feb 19 09:41:31 crc kubenswrapper[4788]: I0219 09:41:31.193568 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r58w6" event={"ID":"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed","Type":"ContainerDied","Data":"43a576d6184e2a2f8c63af27a1bfd05af2397445d7b47086f709e73b61332167"} Feb 19 09:41:31 crc kubenswrapper[4788]: I0219 09:41:31.193789 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r58w6" event={"ID":"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed","Type":"ContainerStarted","Data":"c0b8ec7a71207b00932c2884d6ab761064f974f8c8916ab0eb7a68d0009ad822"} Feb 19 09:41:32 crc kubenswrapper[4788]: I0219 09:41:32.204268 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r58w6" event={"ID":"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed","Type":"ContainerStarted","Data":"27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3"} Feb 19 09:41:34 crc kubenswrapper[4788]: I0219 09:41:34.224415 4788 generic.go:334] "Generic (PLEG): container finished" podID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" containerID="27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3" exitCode=0 Feb 19 09:41:34 crc kubenswrapper[4788]: I0219 09:41:34.224701 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r58w6" 
event={"ID":"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed","Type":"ContainerDied","Data":"27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3"} Feb 19 09:41:35 crc kubenswrapper[4788]: I0219 09:41:35.255223 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r58w6" event={"ID":"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed","Type":"ContainerStarted","Data":"cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f"} Feb 19 09:41:35 crc kubenswrapper[4788]: I0219 09:41:35.286437 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r58w6" podStartSLOduration=2.85138355 podStartE2EDuration="6.286418843s" podCreationTimestamp="2026-02-19 09:41:29 +0000 UTC" firstStartedPulling="2026-02-19 09:41:31.195261585 +0000 UTC m=+3393.183273057" lastFinishedPulling="2026-02-19 09:41:34.630296878 +0000 UTC m=+3396.618308350" observedRunningTime="2026-02-19 09:41:35.277121637 +0000 UTC m=+3397.265133109" watchObservedRunningTime="2026-02-19 09:41:35.286418843 +0000 UTC m=+3397.274430315" Feb 19 09:41:39 crc kubenswrapper[4788]: I0219 09:41:39.861690 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:39 crc kubenswrapper[4788]: I0219 09:41:39.862132 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:39 crc kubenswrapper[4788]: I0219 09:41:39.930325 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:40 crc kubenswrapper[4788]: I0219 09:41:40.364947 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:40 crc kubenswrapper[4788]: I0219 09:41:40.431922 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-r58w6"] Feb 19 09:41:42 crc kubenswrapper[4788]: I0219 09:41:42.315433 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r58w6" podUID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" containerName="registry-server" containerID="cri-o://cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f" gracePeriod=2 Feb 19 09:41:42 crc kubenswrapper[4788]: I0219 09:41:42.834020 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:42 crc kubenswrapper[4788]: I0219 09:41:42.917972 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q279q\" (UniqueName: \"kubernetes.io/projected/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-kube-api-access-q279q\") pod \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " Feb 19 09:41:42 crc kubenswrapper[4788]: I0219 09:41:42.918077 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-catalog-content\") pod \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " Feb 19 09:41:42 crc kubenswrapper[4788]: I0219 09:41:42.918348 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-utilities\") pod \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\" (UID: \"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed\") " Feb 19 09:41:42 crc kubenswrapper[4788]: I0219 09:41:42.919803 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-utilities" (OuterVolumeSpecName: "utilities") pod "0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" (UID: 
"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:41:42 crc kubenswrapper[4788]: I0219 09:41:42.930201 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-kube-api-access-q279q" (OuterVolumeSpecName: "kube-api-access-q279q") pod "0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" (UID: "0c5def20-749c-4dae-9e1f-8d7b6e8d25ed"). InnerVolumeSpecName "kube-api-access-q279q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:41:42 crc kubenswrapper[4788]: I0219 09:41:42.979811 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" (UID: "0c5def20-749c-4dae-9e1f-8d7b6e8d25ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.020279 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.020310 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q279q\" (UniqueName: \"kubernetes.io/projected/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-kube-api-access-q279q\") on node \"crc\" DevicePath \"\"" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.020320 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.324575 4788 generic.go:334] "Generic (PLEG): container finished" 
podID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" containerID="cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f" exitCode=0 Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.324622 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r58w6" event={"ID":"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed","Type":"ContainerDied","Data":"cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f"} Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.324661 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r58w6" event={"ID":"0c5def20-749c-4dae-9e1f-8d7b6e8d25ed","Type":"ContainerDied","Data":"c0b8ec7a71207b00932c2884d6ab761064f974f8c8916ab0eb7a68d0009ad822"} Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.324674 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r58w6" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.324685 4788 scope.go:117] "RemoveContainer" containerID="cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.365577 4788 scope.go:117] "RemoveContainer" containerID="27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.387967 4788 scope.go:117] "RemoveContainer" containerID="43a576d6184e2a2f8c63af27a1bfd05af2397445d7b47086f709e73b61332167" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.393300 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r58w6"] Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.407231 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r58w6"] Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.427104 4788 scope.go:117] "RemoveContainer" 
containerID="cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f" Feb 19 09:41:43 crc kubenswrapper[4788]: E0219 09:41:43.427519 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f\": container with ID starting with cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f not found: ID does not exist" containerID="cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.427556 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f"} err="failed to get container status \"cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f\": rpc error: code = NotFound desc = could not find container \"cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f\": container with ID starting with cff2a4d6e58ea3fdabb372fe6040e1292a1e1ef67c42752ff0401f4104f4364f not found: ID does not exist" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.427583 4788 scope.go:117] "RemoveContainer" containerID="27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3" Feb 19 09:41:43 crc kubenswrapper[4788]: E0219 09:41:43.427998 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3\": container with ID starting with 27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3 not found: ID does not exist" containerID="27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.428023 4788 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3"} err="failed to get container status \"27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3\": rpc error: code = NotFound desc = could not find container \"27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3\": container with ID starting with 27843d9c491a2c1a717b892c1399dda51556e31d59e2dfcee4fe4dc3f6cfe0b3 not found: ID does not exist" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.428038 4788 scope.go:117] "RemoveContainer" containerID="43a576d6184e2a2f8c63af27a1bfd05af2397445d7b47086f709e73b61332167" Feb 19 09:41:43 crc kubenswrapper[4788]: E0219 09:41:43.428428 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a576d6184e2a2f8c63af27a1bfd05af2397445d7b47086f709e73b61332167\": container with ID starting with 43a576d6184e2a2f8c63af27a1bfd05af2397445d7b47086f709e73b61332167 not found: ID does not exist" containerID="43a576d6184e2a2f8c63af27a1bfd05af2397445d7b47086f709e73b61332167" Feb 19 09:41:43 crc kubenswrapper[4788]: I0219 09:41:43.428453 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a576d6184e2a2f8c63af27a1bfd05af2397445d7b47086f709e73b61332167"} err="failed to get container status \"43a576d6184e2a2f8c63af27a1bfd05af2397445d7b47086f709e73b61332167\": rpc error: code = NotFound desc = could not find container \"43a576d6184e2a2f8c63af27a1bfd05af2397445d7b47086f709e73b61332167\": container with ID starting with 43a576d6184e2a2f8c63af27a1bfd05af2397445d7b47086f709e73b61332167 not found: ID does not exist" Feb 19 09:41:44 crc kubenswrapper[4788]: I0219 09:41:44.731683 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" path="/var/lib/kubelet/pods/0c5def20-749c-4dae-9e1f-8d7b6e8d25ed/volumes" Feb 19 09:41:47 crc kubenswrapper[4788]: I0219 
09:41:47.206511 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc_5c4ff725-92d6-4b9a-88d1-bf3366ba1111/util/0.log" Feb 19 09:41:47 crc kubenswrapper[4788]: I0219 09:41:47.402550 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc_5c4ff725-92d6-4b9a-88d1-bf3366ba1111/pull/0.log" Feb 19 09:41:47 crc kubenswrapper[4788]: I0219 09:41:47.409043 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc_5c4ff725-92d6-4b9a-88d1-bf3366ba1111/util/0.log" Feb 19 09:41:47 crc kubenswrapper[4788]: I0219 09:41:47.417745 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc_5c4ff725-92d6-4b9a-88d1-bf3366ba1111/pull/0.log" Feb 19 09:41:47 crc kubenswrapper[4788]: I0219 09:41:47.609966 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc_5c4ff725-92d6-4b9a-88d1-bf3366ba1111/pull/0.log" Feb 19 09:41:47 crc kubenswrapper[4788]: I0219 09:41:47.610349 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc_5c4ff725-92d6-4b9a-88d1-bf3366ba1111/extract/0.log" Feb 19 09:41:47 crc kubenswrapper[4788]: I0219 09:41:47.618161 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_576fdfde700380335ff4e8c968e3071c017f9f650fc9e56ae574b3ca95d5lvc_5c4ff725-92d6-4b9a-88d1-bf3366ba1111/util/0.log" Feb 19 09:41:47 crc kubenswrapper[4788]: I0219 09:41:47.993167 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-bp9gh_0da97318-27f4-465d-91c6-c44004a9e291/manager/0.log" Feb 19 09:41:48 crc kubenswrapper[4788]: I0219 09:41:48.330605 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-74624_17fe7557-8cf3-4f24-86a3-993037455f15/manager/0.log" Feb 19 09:41:48 crc kubenswrapper[4788]: I0219 09:41:48.668094 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-57cc58f5d8-gxbgk_acc6ac6b-da33-4eb0-a2b9-33b6e45a118e/manager/0.log" Feb 19 09:41:48 crc kubenswrapper[4788]: I0219 09:41:48.756553 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-rq69q_8156be62-dae2-4105-9c97-7cdd398e1eb4/manager/0.log" Feb 19 09:41:49 crc kubenswrapper[4788]: I0219 09:41:49.173610 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-tn2mk_3e3f67ff-5285-401a-a19e-2476a8334248/manager/0.log" Feb 19 09:41:49 crc kubenswrapper[4788]: I0219 09:41:49.347803 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-ccpr8_bc6d475e-ace7-47ba-a9ba-cb493c7225c9/manager/0.log" Feb 19 09:41:49 crc kubenswrapper[4788]: I0219 09:41:49.349217 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-754h5_e87d14f5-ac68-489e-a79c-d9962b5786e9/manager/0.log" Feb 19 09:41:49 crc kubenswrapper[4788]: I0219 09:41:49.627453 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-6zwrs_cb9194bc-2a08-4c61-9302-14d3ab1b731a/manager/0.log" Feb 19 09:41:49 crc kubenswrapper[4788]: I0219 09:41:49.636390 4788 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-q6x4r_53d52f5e-a729-4d03-b949-7ccb6719754c/manager/0.log" Feb 19 09:41:49 crc kubenswrapper[4788]: I0219 09:41:49.841945 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-r2rc5_6a57c46b-96e1-4c3a-aede-8b9ced264828/manager/0.log" Feb 19 09:41:50 crc kubenswrapper[4788]: I0219 09:41:50.241475 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-4j2v9_04be53c8-52b4-43fd-9cab-f1484fd17140/manager/0.log" Feb 19 09:41:50 crc kubenswrapper[4788]: I0219 09:41:50.333927 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-9sr69_226ea1a2-f858-46df-8a62-12f5c41da0c5/manager/0.log" Feb 19 09:41:50 crc kubenswrapper[4788]: I0219 09:41:50.802658 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cp8bcg_cad2b3a6-6577-4136-bf9e-213884d94b31/manager/0.log" Feb 19 09:41:51 crc kubenswrapper[4788]: I0219 09:41:51.660331 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5c66fdff94-lm8dp_5b19ebbb-b6af-4e2a-8af4-718efe39cd68/operator/0.log" Feb 19 09:41:51 crc kubenswrapper[4788]: I0219 09:41:51.820766 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-wnrq7_4b258a60-4663-4a78-9d1b-e82add2f9d42/registry-server/0.log" Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.066808 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-ddzgn_c01b7431-8487-45ac-9b7b-c1ec5dc115f0/manager/0.log" Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.138752 4788 patch_prober.go:28] 
interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.138797 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.138835 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.139587 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e559d6ada7592b7d324b68c3abdf68767d50a4eff6ea2aef554a1723fbf1a30"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.139640 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://3e559d6ada7592b7d324b68c3abdf68767d50a4eff6ea2aef554a1723fbf1a30" gracePeriod=600 Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.209141 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-dlfjs_91c888b5-7f35-4049-830e-855914654f90/manager/0.log" Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 
09:41:52.344169 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-fjt75_87f1ff04-454b-4c9d-82e6-5e7239c63978/manager/0.log" Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.407077 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="3e559d6ada7592b7d324b68c3abdf68767d50a4eff6ea2aef554a1723fbf1a30" exitCode=0 Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.407122 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"3e559d6ada7592b7d324b68c3abdf68767d50a4eff6ea2aef554a1723fbf1a30"} Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.407236 4788 scope.go:117] "RemoveContainer" containerID="43ab5bace4def0877a0a4e77509d682eab48854be38ccc9803075a79d7f3fa69" Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.444379 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-gb54d_cfd2e1e0-4b24-4e57-8a3d-779a03729f0e/operator/0.log" Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.596877 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-ncsrp_777de642-ca99-4f43-b282-5c9703e97dfe/manager/0.log" Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.851487 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-pbtll_52b85fb1-54ad-440a-9c9d-d9969f34f1c7/manager/0.log" Feb 19 09:41:52 crc kubenswrapper[4788]: I0219 09:41:52.866262 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-g7mw7_aa0d4ba2-0512-42e8-8c66-137bf969f706/manager/0.log" Feb 19 09:41:53 crc 
kubenswrapper[4788]: I0219 09:41:53.112716 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-6dvcl_7d01364a-9507-4d68-bb0b-efbb67fe2e48/manager/0.log" Feb 19 09:41:53 crc kubenswrapper[4788]: I0219 09:41:53.251943 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6774fbc4bc-5gddb_d337efba-1c27-47ac-bdd7-17c3848678cb/manager/0.log" Feb 19 09:41:53 crc kubenswrapper[4788]: I0219 09:41:53.416140 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"} Feb 19 09:41:54 crc kubenswrapper[4788]: I0219 09:41:54.534162 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-8jh9z_c4a6e8c2-5708-45ec-8cd7-08d552abbe53/manager/0.log" Feb 19 09:42:14 crc kubenswrapper[4788]: I0219 09:42:14.124727 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ng2p7_37572f7a-2fcf-4d28-993d-cd924c0a78b8/control-plane-machine-set-operator/0.log" Feb 19 09:42:14 crc kubenswrapper[4788]: I0219 09:42:14.367332 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9s2zg_2ecb616f-62fc-4ff2-a353-6e08c63581a8/kube-rbac-proxy/0.log" Feb 19 09:42:14 crc kubenswrapper[4788]: I0219 09:42:14.564614 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-9s2zg_2ecb616f-62fc-4ff2-a353-6e08c63581a8/machine-api-operator/0.log" Feb 19 09:42:28 crc kubenswrapper[4788]: I0219 09:42:28.472664 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-858654f9db-2sfnb_14962c5c-80cd-4aa8-918b-902a3853e50c/cert-manager-controller/0.log" Feb 19 09:42:28 crc kubenswrapper[4788]: I0219 09:42:28.567446 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-qnl27_d755f7fa-b68d-421f-b4b5-c25a4ba5af59/cert-manager-cainjector/0.log" Feb 19 09:42:28 crc kubenswrapper[4788]: I0219 09:42:28.667047 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-t5hxr_75272a75-157a-4506-98cb-d6fe9ff79580/cert-manager-webhook/0.log" Feb 19 09:42:42 crc kubenswrapper[4788]: I0219 09:42:42.687140 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-4s4xp_cc4fe95c-3af5-4be2-bf60-2c5e02c18df9/nmstate-console-plugin/0.log" Feb 19 09:42:42 crc kubenswrapper[4788]: I0219 09:42:42.856664 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-s4s7s_5874210a-7a83-4166-8486-0f827772fc30/nmstate-handler/0.log" Feb 19 09:42:43 crc kubenswrapper[4788]: I0219 09:42:43.035919 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-qq85c_6bd96454-70db-4a47-808c-b377bbe1bd00/kube-rbac-proxy/0.log" Feb 19 09:42:43 crc kubenswrapper[4788]: I0219 09:42:43.085253 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-qq85c_6bd96454-70db-4a47-808c-b377bbe1bd00/nmstate-metrics/0.log" Feb 19 09:42:43 crc kubenswrapper[4788]: I0219 09:42:43.232718 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-qpczt_f57ff293-836e-4dbb-a634-1f74882dc23f/nmstate-operator/0.log" Feb 19 09:42:43 crc kubenswrapper[4788]: I0219 09:42:43.261735 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-tvjrb_316cd090-6e25-48f3-89d6-21f11c7aafd9/nmstate-webhook/0.log" Feb 19 09:43:12 crc kubenswrapper[4788]: I0219 09:43:12.495646 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-5z7c2_bd37a23b-542c-489a-90f2-ed7b82c59ec0/kube-rbac-proxy/0.log" Feb 19 09:43:12 crc kubenswrapper[4788]: I0219 09:43:12.659203 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-5z7c2_bd37a23b-542c-489a-90f2-ed7b82c59ec0/controller/0.log" Feb 19 09:43:12 crc kubenswrapper[4788]: I0219 09:43:12.715271 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-frr-files/0.log" Feb 19 09:43:12 crc kubenswrapper[4788]: I0219 09:43:12.876536 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-frr-files/0.log" Feb 19 09:43:12 crc kubenswrapper[4788]: I0219 09:43:12.882562 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-reloader/0.log" Feb 19 09:43:12 crc kubenswrapper[4788]: I0219 09:43:12.889162 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-metrics/0.log" Feb 19 09:43:12 crc kubenswrapper[4788]: I0219 09:43:12.911064 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-reloader/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.052373 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-frr-files/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.052525 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-metrics/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.061260 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-reloader/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.100130 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-metrics/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.262431 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-reloader/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.263913 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-frr-files/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.281323 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/cp-metrics/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.316196 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/controller/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.479869 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/kube-rbac-proxy/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.510670 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/kube-rbac-proxy-frr/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.511773 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/frr-metrics/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.723094 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/reloader/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.729420 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-25kjn_47634f49-df39-43a3-8c1a-850fa890d4dc/frr-k8s-webhook-server/0.log" Feb 19 09:43:13 crc kubenswrapper[4788]: I0219 09:43:13.964791 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-56888c5b56-9lw8k_fb7fcd0f-b451-422c-8984-1494da7aec38/manager/0.log" Feb 19 09:43:14 crc kubenswrapper[4788]: I0219 09:43:14.106405 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6b56bc649d-sz2mk_608a07cc-88f0-405b-87de-f43cc5ee8989/webhook-server/0.log" Feb 19 09:43:14 crc kubenswrapper[4788]: I0219 09:43:14.194923 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-64rbq_d0c713d1-6916-421f-8876-757d3d7dfa45/kube-rbac-proxy/0.log" Feb 19 09:43:14 crc kubenswrapper[4788]: I0219 09:43:14.841399 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-h9lb6_479438a3-ab89-4f1f-a1b8-01ac3c012454/frr/0.log" Feb 19 09:43:14 crc kubenswrapper[4788]: I0219 09:43:14.843626 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-64rbq_d0c713d1-6916-421f-8876-757d3d7dfa45/speaker/0.log" Feb 19 09:43:27 crc kubenswrapper[4788]: I0219 09:43:27.744164 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62_c1d75f31-f259-4276-aadb-af4b0540b221/util/0.log" Feb 19 09:43:27 crc kubenswrapper[4788]: I0219 
09:43:27.991804 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62_c1d75f31-f259-4276-aadb-af4b0540b221/pull/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.003840 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62_c1d75f31-f259-4276-aadb-af4b0540b221/util/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.004274 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62_c1d75f31-f259-4276-aadb-af4b0540b221/pull/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.137704 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62_c1d75f31-f259-4276-aadb-af4b0540b221/pull/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.138848 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62_c1d75f31-f259-4276-aadb-af4b0540b221/util/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.191320 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213slk62_c1d75f31-f259-4276-aadb-af4b0540b221/extract/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.283376 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b2kvp_98c991ec-1b0a-4263-b45a-92a841c291f0/extract-utilities/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.487821 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-b2kvp_98c991ec-1b0a-4263-b45a-92a841c291f0/extract-utilities/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.488288 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b2kvp_98c991ec-1b0a-4263-b45a-92a841c291f0/extract-content/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.529681 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b2kvp_98c991ec-1b0a-4263-b45a-92a841c291f0/extract-content/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.675552 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b2kvp_98c991ec-1b0a-4263-b45a-92a841c291f0/extract-utilities/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.695074 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b2kvp_98c991ec-1b0a-4263-b45a-92a841c291f0/extract-content/0.log" Feb 19 09:43:28 crc kubenswrapper[4788]: I0219 09:43:28.890751 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxq4d_8ff25313-66ec-4853-bf80-45bec9ab0ccd/extract-utilities/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.092805 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxq4d_8ff25313-66ec-4853-bf80-45bec9ab0ccd/extract-content/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.155318 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxq4d_8ff25313-66ec-4853-bf80-45bec9ab0ccd/extract-content/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.202593 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-qxq4d_8ff25313-66ec-4853-bf80-45bec9ab0ccd/extract-utilities/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.228700 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-b2kvp_98c991ec-1b0a-4263-b45a-92a841c291f0/registry-server/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.377574 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxq4d_8ff25313-66ec-4853-bf80-45bec9ab0ccd/extract-utilities/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.378658 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxq4d_8ff25313-66ec-4853-bf80-45bec9ab0ccd/extract-content/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.595697 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs_5dff47f0-9373-498b-b03f-fe5106d271b7/util/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.828407 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs_5dff47f0-9373-498b-b03f-fe5106d271b7/util/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.842413 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxq4d_8ff25313-66ec-4853-bf80-45bec9ab0ccd/registry-server/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.860470 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs_5dff47f0-9373-498b-b03f-fe5106d271b7/pull/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.872816 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs_5dff47f0-9373-498b-b03f-fe5106d271b7/pull/0.log" Feb 19 09:43:29 crc kubenswrapper[4788]: I0219 09:43:29.992540 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs_5dff47f0-9373-498b-b03f-fe5106d271b7/pull/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.001759 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs_5dff47f0-9373-498b-b03f-fe5106d271b7/util/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.041327 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecadq5hs_5dff47f0-9373-498b-b03f-fe5106d271b7/extract/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.200518 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7fl6r_d2a09672-cb2b-4c7a-93f2-78a0e16d752f/marketplace-operator/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.222488 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4jxxx_f19f6400-b7f1-4a05-beb6-dc7ff4e23d71/extract-utilities/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.410653 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4jxxx_f19f6400-b7f1-4a05-beb6-dc7ff4e23d71/extract-content/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.414635 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4jxxx_f19f6400-b7f1-4a05-beb6-dc7ff4e23d71/extract-content/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.414654 4788 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4jxxx_f19f6400-b7f1-4a05-beb6-dc7ff4e23d71/extract-utilities/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.612614 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4jxxx_f19f6400-b7f1-4a05-beb6-dc7ff4e23d71/extract-content/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.684525 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4jxxx_f19f6400-b7f1-4a05-beb6-dc7ff4e23d71/extract-utilities/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.771067 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4jxxx_f19f6400-b7f1-4a05-beb6-dc7ff4e23d71/registry-server/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.816586 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxdcj_e8935778-1ae8-4c0c-8189-e6240f5c2d23/extract-utilities/0.log" Feb 19 09:43:30 crc kubenswrapper[4788]: I0219 09:43:30.997750 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxdcj_e8935778-1ae8-4c0c-8189-e6240f5c2d23/extract-utilities/0.log" Feb 19 09:43:31 crc kubenswrapper[4788]: I0219 09:43:31.017129 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxdcj_e8935778-1ae8-4c0c-8189-e6240f5c2d23/extract-content/0.log" Feb 19 09:43:31 crc kubenswrapper[4788]: I0219 09:43:31.043573 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxdcj_e8935778-1ae8-4c0c-8189-e6240f5c2d23/extract-content/0.log" Feb 19 09:43:31 crc kubenswrapper[4788]: I0219 09:43:31.188546 4788 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-hxdcj_e8935778-1ae8-4c0c-8189-e6240f5c2d23/extract-utilities/0.log" Feb 19 09:43:31 crc kubenswrapper[4788]: I0219 09:43:31.232588 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxdcj_e8935778-1ae8-4c0c-8189-e6240f5c2d23/extract-content/0.log" Feb 19 09:43:31 crc kubenswrapper[4788]: I0219 09:43:31.602281 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hxdcj_e8935778-1ae8-4c0c-8189-e6240f5c2d23/registry-server/0.log" Feb 19 09:43:52 crc kubenswrapper[4788]: I0219 09:43:52.138774 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:43:52 crc kubenswrapper[4788]: I0219 09:43:52.139351 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.118077 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b54dv"] Feb 19 09:44:10 crc kubenswrapper[4788]: E0219 09:44:10.119220 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" containerName="extract-utilities" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.119241 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" containerName="extract-utilities" Feb 19 09:44:10 crc kubenswrapper[4788]: E0219 09:44:10.119294 4788 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" containerName="registry-server" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.119306 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" containerName="registry-server" Feb 19 09:44:10 crc kubenswrapper[4788]: E0219 09:44:10.119332 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" containerName="extract-content" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.119342 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" containerName="extract-content" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.119699 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c5def20-749c-4dae-9e1f-8d7b6e8d25ed" containerName="registry-server" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.121778 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.131124 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b54dv"] Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.186867 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-utilities\") pod \"redhat-marketplace-b54dv\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.186929 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-catalog-content\") pod \"redhat-marketplace-b54dv\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.187029 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkhnt\" (UniqueName: \"kubernetes.io/projected/d0a12d04-95d7-4291-9c60-a971833e1643-kube-api-access-gkhnt\") pod \"redhat-marketplace-b54dv\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.289033 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkhnt\" (UniqueName: \"kubernetes.io/projected/d0a12d04-95d7-4291-9c60-a971833e1643-kube-api-access-gkhnt\") pod \"redhat-marketplace-b54dv\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.289234 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-utilities\") pod \"redhat-marketplace-b54dv\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.289317 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-catalog-content\") pod \"redhat-marketplace-b54dv\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.290218 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-catalog-content\") pod \"redhat-marketplace-b54dv\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.291143 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-utilities\") pod \"redhat-marketplace-b54dv\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.314983 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkhnt\" (UniqueName: \"kubernetes.io/projected/d0a12d04-95d7-4291-9c60-a971833e1643-kube-api-access-gkhnt\") pod \"redhat-marketplace-b54dv\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:10 crc kubenswrapper[4788]: I0219 09:44:10.449333 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:11 crc kubenswrapper[4788]: I0219 09:44:11.019853 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b54dv"] Feb 19 09:44:11 crc kubenswrapper[4788]: I0219 09:44:11.752719 4788 generic.go:334] "Generic (PLEG): container finished" podID="d0a12d04-95d7-4291-9c60-a971833e1643" containerID="2ea36ea6b75f5beb2b17d21dc0fab4fbec154b0a055a5f405a6ceab3628500af" exitCode=0 Feb 19 09:44:11 crc kubenswrapper[4788]: I0219 09:44:11.752880 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b54dv" event={"ID":"d0a12d04-95d7-4291-9c60-a971833e1643","Type":"ContainerDied","Data":"2ea36ea6b75f5beb2b17d21dc0fab4fbec154b0a055a5f405a6ceab3628500af"} Feb 19 09:44:11 crc kubenswrapper[4788]: I0219 09:44:11.753084 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b54dv" event={"ID":"d0a12d04-95d7-4291-9c60-a971833e1643","Type":"ContainerStarted","Data":"b38660e20c2263f1a2dad98144a8bbe91a6bf836971bf351fcfa61f4a8070d4f"} Feb 19 09:44:11 crc kubenswrapper[4788]: I0219 09:44:11.755552 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:44:12 crc kubenswrapper[4788]: I0219 09:44:12.767284 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b54dv" event={"ID":"d0a12d04-95d7-4291-9c60-a971833e1643","Type":"ContainerStarted","Data":"1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807"} Feb 19 09:44:13 crc kubenswrapper[4788]: I0219 09:44:13.779674 4788 generic.go:334] "Generic (PLEG): container finished" podID="d0a12d04-95d7-4291-9c60-a971833e1643" containerID="1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807" exitCode=0 Feb 19 09:44:13 crc kubenswrapper[4788]: I0219 09:44:13.779723 4788 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-b54dv" event={"ID":"d0a12d04-95d7-4291-9c60-a971833e1643","Type":"ContainerDied","Data":"1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807"} Feb 19 09:44:14 crc kubenswrapper[4788]: I0219 09:44:14.799024 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b54dv" event={"ID":"d0a12d04-95d7-4291-9c60-a971833e1643","Type":"ContainerStarted","Data":"0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b"} Feb 19 09:44:14 crc kubenswrapper[4788]: I0219 09:44:14.839735 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b54dv" podStartSLOduration=2.2813191809999998 podStartE2EDuration="4.839706022s" podCreationTimestamp="2026-02-19 09:44:10 +0000 UTC" firstStartedPulling="2026-02-19 09:44:11.755148895 +0000 UTC m=+3553.743160377" lastFinishedPulling="2026-02-19 09:44:14.313535756 +0000 UTC m=+3556.301547218" observedRunningTime="2026-02-19 09:44:14.827347145 +0000 UTC m=+3556.815358647" watchObservedRunningTime="2026-02-19 09:44:14.839706022 +0000 UTC m=+3556.827717504" Feb 19 09:44:20 crc kubenswrapper[4788]: I0219 09:44:20.449848 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:20 crc kubenswrapper[4788]: I0219 09:44:20.450410 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:20 crc kubenswrapper[4788]: I0219 09:44:20.534600 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:20 crc kubenswrapper[4788]: I0219 09:44:20.906068 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:22 crc kubenswrapper[4788]: I0219 
09:44:22.142131 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:44:22 crc kubenswrapper[4788]: I0219 09:44:22.142227 4788 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 19 09:44:22 crc kubenswrapper[4788]: I0219 09:44:22.487042 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b54dv"] Feb 19 09:44:22 crc kubenswrapper[4788]: I0219 09:44:22.876422 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b54dv" podUID="d0a12d04-95d7-4291-9c60-a971833e1643" containerName="registry-server" containerID="cri-o://0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b" gracePeriod=2 Feb 19 09:44:23 crc kubenswrapper[4788]: I0219 09:44:23.861811 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:23 crc kubenswrapper[4788]: I0219 09:44:23.899740 4788 generic.go:334] "Generic (PLEG): container finished" podID="d0a12d04-95d7-4291-9c60-a971833e1643" containerID="0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b" exitCode=0 Feb 19 09:44:23 crc kubenswrapper[4788]: I0219 09:44:23.899803 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b54dv" event={"ID":"d0a12d04-95d7-4291-9c60-a971833e1643","Type":"ContainerDied","Data":"0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b"} Feb 19 09:44:23 crc kubenswrapper[4788]: I0219 09:44:23.899843 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b54dv" Feb 19 09:44:23 crc kubenswrapper[4788]: I0219 09:44:23.899892 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b54dv" event={"ID":"d0a12d04-95d7-4291-9c60-a971833e1643","Type":"ContainerDied","Data":"b38660e20c2263f1a2dad98144a8bbe91a6bf836971bf351fcfa61f4a8070d4f"} Feb 19 09:44:23 crc kubenswrapper[4788]: I0219 09:44:23.899927 4788 scope.go:117] "RemoveContainer" containerID="0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b" Feb 19 09:44:23 crc kubenswrapper[4788]: I0219 09:44:23.941740 4788 scope.go:117] "RemoveContainer" containerID="1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807" Feb 19 09:44:23 crc kubenswrapper[4788]: I0219 09:44:23.965890 4788 scope.go:117] "RemoveContainer" containerID="2ea36ea6b75f5beb2b17d21dc0fab4fbec154b0a055a5f405a6ceab3628500af" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.020788 4788 scope.go:117] "RemoveContainer" containerID="0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b" Feb 19 09:44:24 crc kubenswrapper[4788]: E0219 09:44:24.021491 4788 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b\": container with ID starting with 0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b not found: ID does not exist" containerID="0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.021647 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b"} err="failed to get container status \"0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b\": rpc error: code = NotFound desc = could not find container \"0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b\": container with ID starting with 0b039e0f4fcbdf112e3e8ac051f71ac42dec555f53417d80be60ae13a3356c0b not found: ID does not exist" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.021734 4788 scope.go:117] "RemoveContainer" containerID="1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807" Feb 19 09:44:24 crc kubenswrapper[4788]: E0219 09:44:24.022632 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807\": container with ID starting with 1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807 not found: ID does not exist" containerID="1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.022680 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807"} err="failed to get container status \"1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807\": rpc error: code = NotFound desc = could not find container 
\"1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807\": container with ID starting with 1d0308354f839f106aa4e79bec0a26815841d02d973b9f78af3f89d2d5d82807 not found: ID does not exist" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.022713 4788 scope.go:117] "RemoveContainer" containerID="2ea36ea6b75f5beb2b17d21dc0fab4fbec154b0a055a5f405a6ceab3628500af" Feb 19 09:44:24 crc kubenswrapper[4788]: E0219 09:44:24.023335 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea36ea6b75f5beb2b17d21dc0fab4fbec154b0a055a5f405a6ceab3628500af\": container with ID starting with 2ea36ea6b75f5beb2b17d21dc0fab4fbec154b0a055a5f405a6ceab3628500af not found: ID does not exist" containerID="2ea36ea6b75f5beb2b17d21dc0fab4fbec154b0a055a5f405a6ceab3628500af" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.023417 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea36ea6b75f5beb2b17d21dc0fab4fbec154b0a055a5f405a6ceab3628500af"} err="failed to get container status \"2ea36ea6b75f5beb2b17d21dc0fab4fbec154b0a055a5f405a6ceab3628500af\": rpc error: code = NotFound desc = could not find container \"2ea36ea6b75f5beb2b17d21dc0fab4fbec154b0a055a5f405a6ceab3628500af\": container with ID starting with 2ea36ea6b75f5beb2b17d21dc0fab4fbec154b0a055a5f405a6ceab3628500af not found: ID does not exist" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.024946 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-utilities\") pod \"d0a12d04-95d7-4291-9c60-a971833e1643\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.024992 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-catalog-content\") pod \"d0a12d04-95d7-4291-9c60-a971833e1643\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.025123 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkhnt\" (UniqueName: \"kubernetes.io/projected/d0a12d04-95d7-4291-9c60-a971833e1643-kube-api-access-gkhnt\") pod \"d0a12d04-95d7-4291-9c60-a971833e1643\" (UID: \"d0a12d04-95d7-4291-9c60-a971833e1643\") " Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.027071 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-utilities" (OuterVolumeSpecName: "utilities") pod "d0a12d04-95d7-4291-9c60-a971833e1643" (UID: "d0a12d04-95d7-4291-9c60-a971833e1643"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.033880 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a12d04-95d7-4291-9c60-a971833e1643-kube-api-access-gkhnt" (OuterVolumeSpecName: "kube-api-access-gkhnt") pod "d0a12d04-95d7-4291-9c60-a971833e1643" (UID: "d0a12d04-95d7-4291-9c60-a971833e1643"). InnerVolumeSpecName "kube-api-access-gkhnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.061068 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0a12d04-95d7-4291-9c60-a971833e1643" (UID: "d0a12d04-95d7-4291-9c60-a971833e1643"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.128676 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.128733 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0a12d04-95d7-4291-9c60-a971833e1643-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.128757 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkhnt\" (UniqueName: \"kubernetes.io/projected/d0a12d04-95d7-4291-9c60-a971833e1643-kube-api-access-gkhnt\") on node \"crc\" DevicePath \"\"" Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.267049 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b54dv"] Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.278470 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b54dv"] Feb 19 09:44:24 crc kubenswrapper[4788]: I0219 09:44:24.740062 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a12d04-95d7-4291-9c60-a971833e1643" path="/var/lib/kubelet/pods/d0a12d04-95d7-4291-9c60-a971833e1643/volumes" Feb 19 09:44:52 crc kubenswrapper[4788]: I0219 09:44:52.139793 4788 patch_prober.go:28] interesting pod/machine-config-daemon-tftzx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 19 09:44:52 crc kubenswrapper[4788]: I0219 09:44:52.140687 4788 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 19 09:44:52 crc kubenswrapper[4788]: I0219 09:44:52.140819 4788 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-tftzx"
Feb 19 09:44:52 crc kubenswrapper[4788]: I0219 09:44:52.141976 4788 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"} pod="openshift-machine-config-operator/machine-config-daemon-tftzx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 19 09:44:52 crc kubenswrapper[4788]: I0219 09:44:52.142124 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" containerName="machine-config-daemon" containerID="cri-o://7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8" gracePeriod=600
Feb 19 09:44:52 crc kubenswrapper[4788]: E0219 09:44:52.282173 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:44:52 crc kubenswrapper[4788]: I0219 09:44:52.309139 4788 generic.go:334] "Generic (PLEG): container finished" podID="2c07881f-4511-4cd1-9283-6891826b57a1" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8" exitCode=0
Feb 19 09:44:52 crc kubenswrapper[4788]: I0219 09:44:52.309186 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerDied","Data":"7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"}
Feb 19 09:44:52 crc kubenswrapper[4788]: I0219 09:44:52.309224 4788 scope.go:117] "RemoveContainer" containerID="3e559d6ada7592b7d324b68c3abdf68767d50a4eff6ea2aef554a1723fbf1a30"
Feb 19 09:44:52 crc kubenswrapper[4788]: I0219 09:44:52.309821 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:44:52 crc kubenswrapper[4788]: E0219 09:44:52.310052 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.168509 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"]
Feb 19 09:45:00 crc kubenswrapper[4788]: E0219 09:45:00.169440 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a12d04-95d7-4291-9c60-a971833e1643" containerName="registry-server"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.169454 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a12d04-95d7-4291-9c60-a971833e1643" containerName="registry-server"
Feb 19 09:45:00 crc kubenswrapper[4788]: E0219 09:45:00.169478 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a12d04-95d7-4291-9c60-a971833e1643" containerName="extract-utilities"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.169484 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a12d04-95d7-4291-9c60-a971833e1643" containerName="extract-utilities"
Feb 19 09:45:00 crc kubenswrapper[4788]: E0219 09:45:00.169507 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a12d04-95d7-4291-9c60-a971833e1643" containerName="extract-content"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.169513 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a12d04-95d7-4291-9c60-a971833e1643" containerName="extract-content"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.169686 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a12d04-95d7-4291-9c60-a971833e1643" containerName="registry-server"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.171315 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.173328 4788 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.173626 4788 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.184697 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"]
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.321342 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt4l7\" (UniqueName: \"kubernetes.io/projected/da296119-9af4-47b8-be6e-b9b717bd5669-kube-api-access-lt4l7\") pod \"collect-profiles-29524905-4ksnn\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.321630 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da296119-9af4-47b8-be6e-b9b717bd5669-config-volume\") pod \"collect-profiles-29524905-4ksnn\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.321980 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da296119-9af4-47b8-be6e-b9b717bd5669-secret-volume\") pod \"collect-profiles-29524905-4ksnn\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.423806 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da296119-9af4-47b8-be6e-b9b717bd5669-secret-volume\") pod \"collect-profiles-29524905-4ksnn\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.423965 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt4l7\" (UniqueName: \"kubernetes.io/projected/da296119-9af4-47b8-be6e-b9b717bd5669-kube-api-access-lt4l7\") pod \"collect-profiles-29524905-4ksnn\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.424073 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da296119-9af4-47b8-be6e-b9b717bd5669-config-volume\") pod \"collect-profiles-29524905-4ksnn\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.425736 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da296119-9af4-47b8-be6e-b9b717bd5669-config-volume\") pod \"collect-profiles-29524905-4ksnn\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.440227 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da296119-9af4-47b8-be6e-b9b717bd5669-secret-volume\") pod \"collect-profiles-29524905-4ksnn\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.461458 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt4l7\" (UniqueName: \"kubernetes.io/projected/da296119-9af4-47b8-be6e-b9b717bd5669-kube-api-access-lt4l7\") pod \"collect-profiles-29524905-4ksnn\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:00 crc kubenswrapper[4788]: I0219 09:45:00.491644 4788 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:01 crc kubenswrapper[4788]: I0219 09:45:01.110688 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"]
Feb 19 09:45:01 crc kubenswrapper[4788]: I0219 09:45:01.398167 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn" event={"ID":"da296119-9af4-47b8-be6e-b9b717bd5669","Type":"ContainerStarted","Data":"79981731bbed81a6a3247fd3e8d11c2895072bdbd731ce17e000ddd2d16e0a04"}
Feb 19 09:45:01 crc kubenswrapper[4788]: I0219 09:45:01.398237 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn" event={"ID":"da296119-9af4-47b8-be6e-b9b717bd5669","Type":"ContainerStarted","Data":"700de8e0f85a999a2e16f400b00ac44aea4e6475cf93f6489041344c155c7b7f"}
Feb 19 09:45:01 crc kubenswrapper[4788]: I0219 09:45:01.416909 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn" podStartSLOduration=1.416892697 podStartE2EDuration="1.416892697s" podCreationTimestamp="2026-02-19 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-19 09:45:01.414828296 +0000 UTC m=+3603.402839808" watchObservedRunningTime="2026-02-19 09:45:01.416892697 +0000 UTC m=+3603.404904169"
Feb 19 09:45:02 crc kubenswrapper[4788]: I0219 09:45:02.411636 4788 generic.go:334] "Generic (PLEG): container finished" podID="da296119-9af4-47b8-be6e-b9b717bd5669" containerID="79981731bbed81a6a3247fd3e8d11c2895072bdbd731ce17e000ddd2d16e0a04" exitCode=0
Feb 19 09:45:02 crc kubenswrapper[4788]: I0219 09:45:02.411681 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn" event={"ID":"da296119-9af4-47b8-be6e-b9b717bd5669","Type":"ContainerDied","Data":"79981731bbed81a6a3247fd3e8d11c2895072bdbd731ce17e000ddd2d16e0a04"}
Feb 19 09:45:03 crc kubenswrapper[4788]: I0219 09:45:03.844631 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.003878 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da296119-9af4-47b8-be6e-b9b717bd5669-config-volume\") pod \"da296119-9af4-47b8-be6e-b9b717bd5669\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") "
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.004072 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt4l7\" (UniqueName: \"kubernetes.io/projected/da296119-9af4-47b8-be6e-b9b717bd5669-kube-api-access-lt4l7\") pod \"da296119-9af4-47b8-be6e-b9b717bd5669\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") "
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.004186 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da296119-9af4-47b8-be6e-b9b717bd5669-secret-volume\") pod \"da296119-9af4-47b8-be6e-b9b717bd5669\" (UID: \"da296119-9af4-47b8-be6e-b9b717bd5669\") "
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.004545 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da296119-9af4-47b8-be6e-b9b717bd5669-config-volume" (OuterVolumeSpecName: "config-volume") pod "da296119-9af4-47b8-be6e-b9b717bd5669" (UID: "da296119-9af4-47b8-be6e-b9b717bd5669"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.004899 4788 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da296119-9af4-47b8-be6e-b9b717bd5669-config-volume\") on node \"crc\" DevicePath \"\""
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.014170 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da296119-9af4-47b8-be6e-b9b717bd5669-kube-api-access-lt4l7" (OuterVolumeSpecName: "kube-api-access-lt4l7") pod "da296119-9af4-47b8-be6e-b9b717bd5669" (UID: "da296119-9af4-47b8-be6e-b9b717bd5669"). InnerVolumeSpecName "kube-api-access-lt4l7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.025611 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da296119-9af4-47b8-be6e-b9b717bd5669-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da296119-9af4-47b8-be6e-b9b717bd5669" (UID: "da296119-9af4-47b8-be6e-b9b717bd5669"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.106583 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt4l7\" (UniqueName: \"kubernetes.io/projected/da296119-9af4-47b8-be6e-b9b717bd5669-kube-api-access-lt4l7\") on node \"crc\" DevicePath \"\""
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.106622 4788 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da296119-9af4-47b8-be6e-b9b717bd5669-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.434035 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn" event={"ID":"da296119-9af4-47b8-be6e-b9b717bd5669","Type":"ContainerDied","Data":"700de8e0f85a999a2e16f400b00ac44aea4e6475cf93f6489041344c155c7b7f"}
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.434088 4788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="700de8e0f85a999a2e16f400b00ac44aea4e6475cf93f6489041344c155c7b7f"
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.434152 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29524905-4ksnn"
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.506961 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"]
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.514654 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29524860-bvlmd"]
Feb 19 09:45:04 crc kubenswrapper[4788]: I0219 09:45:04.735579 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a9b0944-a77d-4a9c-9d2c-f423333e62a0" path="/var/lib/kubelet/pods/0a9b0944-a77d-4a9c-9d2c-f423333e62a0/volumes"
Feb 19 09:45:06 crc kubenswrapper[4788]: I0219 09:45:06.714449 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:45:06 crc kubenswrapper[4788]: E0219 09:45:06.715015 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:45:17 crc kubenswrapper[4788]: I0219 09:45:17.715890 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:45:17 crc kubenswrapper[4788]: E0219 09:45:17.716738 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:45:22 crc kubenswrapper[4788]: I0219 09:45:22.622371 4788 generic.go:334] "Generic (PLEG): container finished" podID="e2edf248-6e05-430c-9c6e-070d15cbb9b9" containerID="5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8" exitCode=0
Feb 19 09:45:22 crc kubenswrapper[4788]: I0219 09:45:22.622475 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sb7nx/must-gather-x22vs" event={"ID":"e2edf248-6e05-430c-9c6e-070d15cbb9b9","Type":"ContainerDied","Data":"5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8"}
Feb 19 09:45:22 crc kubenswrapper[4788]: I0219 09:45:22.623964 4788 scope.go:117] "RemoveContainer" containerID="5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8"
Feb 19 09:45:22 crc kubenswrapper[4788]: I0219 09:45:22.827542 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sb7nx_must-gather-x22vs_e2edf248-6e05-430c-9c6e-070d15cbb9b9/gather/0.log"
Feb 19 09:45:24 crc kubenswrapper[4788]: I0219 09:45:24.728619 4788 scope.go:117] "RemoveContainer" containerID="5395284b6654e0eac4bd84a33d45b6771543388a579b8ea1b996fab350a066ec"
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.031283 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sb7nx/must-gather-x22vs"]
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.032079 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-sb7nx/must-gather-x22vs" podUID="e2edf248-6e05-430c-9c6e-070d15cbb9b9" containerName="copy" containerID="cri-o://4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb" gracePeriod=2
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.040998 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sb7nx/must-gather-x22vs"]
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.449497 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sb7nx_must-gather-x22vs_e2edf248-6e05-430c-9c6e-070d15cbb9b9/copy/0.log"
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.450298 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sb7nx/must-gather-x22vs"
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.477474 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e2edf248-6e05-430c-9c6e-070d15cbb9b9-must-gather-output\") pod \"e2edf248-6e05-430c-9c6e-070d15cbb9b9\" (UID: \"e2edf248-6e05-430c-9c6e-070d15cbb9b9\") "
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.477764 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjktx\" (UniqueName: \"kubernetes.io/projected/e2edf248-6e05-430c-9c6e-070d15cbb9b9-kube-api-access-mjktx\") pod \"e2edf248-6e05-430c-9c6e-070d15cbb9b9\" (UID: \"e2edf248-6e05-430c-9c6e-070d15cbb9b9\") "
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.488177 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2edf248-6e05-430c-9c6e-070d15cbb9b9-kube-api-access-mjktx" (OuterVolumeSpecName: "kube-api-access-mjktx") pod "e2edf248-6e05-430c-9c6e-070d15cbb9b9" (UID: "e2edf248-6e05-430c-9c6e-070d15cbb9b9"). InnerVolumeSpecName "kube-api-access-mjktx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.579792 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjktx\" (UniqueName: \"kubernetes.io/projected/e2edf248-6e05-430c-9c6e-070d15cbb9b9-kube-api-access-mjktx\") on node \"crc\" DevicePath \"\""
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.655541 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2edf248-6e05-430c-9c6e-070d15cbb9b9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e2edf248-6e05-430c-9c6e-070d15cbb9b9" (UID: "e2edf248-6e05-430c-9c6e-070d15cbb9b9"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.682075 4788 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e2edf248-6e05-430c-9c6e-070d15cbb9b9-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.706262 4788 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sb7nx_must-gather-x22vs_e2edf248-6e05-430c-9c6e-070d15cbb9b9/copy/0.log"
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.706692 4788 generic.go:334] "Generic (PLEG): container finished" podID="e2edf248-6e05-430c-9c6e-070d15cbb9b9" containerID="4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb" exitCode=143
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.706775 4788 scope.go:117] "RemoveContainer" containerID="4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb"
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.706740 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sb7nx/must-gather-x22vs"
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.715413 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:45:31 crc kubenswrapper[4788]: E0219 09:45:31.715713 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.739664 4788 scope.go:117] "RemoveContainer" containerID="5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8"
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.838893 4788 scope.go:117] "RemoveContainer" containerID="4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb"
Feb 19 09:45:31 crc kubenswrapper[4788]: E0219 09:45:31.842576 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb\": container with ID starting with 4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb not found: ID does not exist" containerID="4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb"
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.842604 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb"} err="failed to get container status \"4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb\": rpc error: code = NotFound desc = could not find container \"4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb\": container with ID starting with 4b095564d14bcded1dc076af3f3f2bca4e2077966c4c9ff7b6fbc79957258feb not found: ID does not exist"
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.842626 4788 scope.go:117] "RemoveContainer" containerID="5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8"
Feb 19 09:45:31 crc kubenswrapper[4788]: E0219 09:45:31.843020 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8\": container with ID starting with 5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8 not found: ID does not exist" containerID="5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8"
Feb 19 09:45:31 crc kubenswrapper[4788]: I0219 09:45:31.843040 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8"} err="failed to get container status \"5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8\": rpc error: code = NotFound desc = could not find container \"5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8\": container with ID starting with 5755417cd1a8943c0fa20b385b0ab4652a3fb97f0d9632838a27fe6406f177e8 not found: ID does not exist"
Feb 19 09:45:32 crc kubenswrapper[4788]: I0219 09:45:32.735422 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2edf248-6e05-430c-9c6e-070d15cbb9b9" path="/var/lib/kubelet/pods/e2edf248-6e05-430c-9c6e-070d15cbb9b9/volumes"
Feb 19 09:45:44 crc kubenswrapper[4788]: I0219 09:45:44.714781 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:45:44 crc kubenswrapper[4788]: E0219 09:45:44.715617 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:45:55 crc kubenswrapper[4788]: I0219 09:45:55.714753 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:45:55 crc kubenswrapper[4788]: E0219 09:45:55.716071 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:46:07 crc kubenswrapper[4788]: I0219 09:46:07.715367 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:46:07 crc kubenswrapper[4788]: E0219 09:46:07.716355 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:46:21 crc kubenswrapper[4788]: I0219 09:46:21.714161 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:46:21 crc kubenswrapper[4788]: E0219 09:46:21.714983 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:46:36 crc kubenswrapper[4788]: I0219 09:46:36.715351 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:46:36 crc kubenswrapper[4788]: E0219 09:46:36.716365 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:46:47 crc kubenswrapper[4788]: I0219 09:46:47.714406 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:46:47 crc kubenswrapper[4788]: E0219 09:46:47.715322 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:47:00 crc kubenswrapper[4788]: I0219 09:47:00.722024 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:47:00 crc kubenswrapper[4788]: E0219 09:47:00.722609 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:47:14 crc kubenswrapper[4788]: I0219 09:47:14.714135 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:47:14 crc kubenswrapper[4788]: E0219 09:47:14.714806 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:47:27 crc kubenswrapper[4788]: I0219 09:47:27.714415 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:47:27 crc kubenswrapper[4788]: E0219 09:47:27.715236 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:47:40 crc kubenswrapper[4788]: I0219 09:47:40.717054 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:47:40 crc kubenswrapper[4788]: E0219 09:47:40.718185 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:47:54 crc kubenswrapper[4788]: I0219 09:47:54.714004 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:47:54 crc kubenswrapper[4788]: E0219 09:47:54.714649 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:48:06 crc kubenswrapper[4788]: I0219 09:48:06.714671 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:48:06 crc kubenswrapper[4788]: E0219 09:48:06.715403 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:48:17 crc kubenswrapper[4788]: I0219 09:48:17.715233 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:48:17 crc kubenswrapper[4788]: E0219 09:48:17.716182 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:48:31 crc kubenswrapper[4788]: I0219 09:48:31.715176 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:48:31 crc kubenswrapper[4788]: E0219 09:48:31.716157 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:48:46 crc kubenswrapper[4788]: I0219 09:48:46.715442 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:48:46 crc kubenswrapper[4788]: E0219 09:48:46.716903 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:48:59 crc kubenswrapper[4788]: I0219 09:48:59.715053 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:48:59 crc kubenswrapper[4788]: E0219 09:48:59.715785 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:49:14 crc kubenswrapper[4788]: I0219 09:49:14.714039 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:49:14 crc kubenswrapper[4788]: E0219 09:49:14.714734 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1"
Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.551826 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-52qjq"]
Feb 19 09:49:27 crc kubenswrapper[4788]: E0219 09:49:27.552712 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2edf248-6e05-430c-9c6e-070d15cbb9b9" containerName="copy"
Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.552726 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2edf248-6e05-430c-9c6e-070d15cbb9b9" containerName="copy"
Feb 19 09:49:27 crc kubenswrapper[4788]: E0219 09:49:27.552741 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2edf248-6e05-430c-9c6e-070d15cbb9b9" containerName="gather"
Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.552747 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2edf248-6e05-430c-9c6e-070d15cbb9b9" containerName="gather"
Feb 19 09:49:27 crc kubenswrapper[4788]: E0219 09:49:27.552780 4788 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da296119-9af4-47b8-be6e-b9b717bd5669" containerName="collect-profiles"
Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.552786 4788 state_mem.go:107] "Deleted CPUSet assignment" podUID="da296119-9af4-47b8-be6e-b9b717bd5669" containerName="collect-profiles"
Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.552960 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2edf248-6e05-430c-9c6e-070d15cbb9b9" containerName="gather"
Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.552987 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="da296119-9af4-47b8-be6e-b9b717bd5669" containerName="collect-profiles"
Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.553024 4788 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2edf248-6e05-430c-9c6e-070d15cbb9b9" containerName="copy"
Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.554591 4788 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.560189 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-52qjq"] Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.699069 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wt2p\" (UniqueName: \"kubernetes.io/projected/70b146bb-8c02-4c3e-af4e-192e857a3e58-kube-api-access-2wt2p\") pod \"redhat-operators-52qjq\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.699146 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-utilities\") pod \"redhat-operators-52qjq\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.699212 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-catalog-content\") pod \"redhat-operators-52qjq\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.803037 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wt2p\" (UniqueName: \"kubernetes.io/projected/70b146bb-8c02-4c3e-af4e-192e857a3e58-kube-api-access-2wt2p\") pod \"redhat-operators-52qjq\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.803135 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-utilities\") pod \"redhat-operators-52qjq\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.803234 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-catalog-content\") pod \"redhat-operators-52qjq\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.803939 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-catalog-content\") pod \"redhat-operators-52qjq\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.804276 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-utilities\") pod \"redhat-operators-52qjq\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.835050 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wt2p\" (UniqueName: \"kubernetes.io/projected/70b146bb-8c02-4c3e-af4e-192e857a3e58-kube-api-access-2wt2p\") pod \"redhat-operators-52qjq\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:27 crc kubenswrapper[4788]: I0219 09:49:27.874750 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:28 crc kubenswrapper[4788]: I0219 09:49:28.354196 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-52qjq"] Feb 19 09:49:28 crc kubenswrapper[4788]: I0219 09:49:28.721907 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8" Feb 19 09:49:28 crc kubenswrapper[4788]: E0219 09:49:28.722237 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:49:29 crc kubenswrapper[4788]: I0219 09:49:29.170428 4788 generic.go:334] "Generic (PLEG): container finished" podID="70b146bb-8c02-4c3e-af4e-192e857a3e58" containerID="934bc09a4d26dbc27b64ef7bf50f1dbd9b41596010ea89b03b717b1ac8999de9" exitCode=0 Feb 19 09:49:29 crc kubenswrapper[4788]: I0219 09:49:29.170490 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qjq" event={"ID":"70b146bb-8c02-4c3e-af4e-192e857a3e58","Type":"ContainerDied","Data":"934bc09a4d26dbc27b64ef7bf50f1dbd9b41596010ea89b03b717b1ac8999de9"} Feb 19 09:49:29 crc kubenswrapper[4788]: I0219 09:49:29.170529 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qjq" event={"ID":"70b146bb-8c02-4c3e-af4e-192e857a3e58","Type":"ContainerStarted","Data":"8f461fda9421610391049d066a09f9fb421525b5f3bad782ae340bb1a157fb72"} Feb 19 09:49:29 crc kubenswrapper[4788]: I0219 09:49:29.172876 4788 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 19 09:49:30 crc 
kubenswrapper[4788]: I0219 09:49:30.180889 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qjq" event={"ID":"70b146bb-8c02-4c3e-af4e-192e857a3e58","Type":"ContainerStarted","Data":"57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6"} Feb 19 09:49:34 crc kubenswrapper[4788]: I0219 09:49:34.222208 4788 generic.go:334] "Generic (PLEG): container finished" podID="70b146bb-8c02-4c3e-af4e-192e857a3e58" containerID="57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6" exitCode=0 Feb 19 09:49:34 crc kubenswrapper[4788]: I0219 09:49:34.222488 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qjq" event={"ID":"70b146bb-8c02-4c3e-af4e-192e857a3e58","Type":"ContainerDied","Data":"57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6"} Feb 19 09:49:35 crc kubenswrapper[4788]: I0219 09:49:35.233725 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qjq" event={"ID":"70b146bb-8c02-4c3e-af4e-192e857a3e58","Type":"ContainerStarted","Data":"22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c"} Feb 19 09:49:35 crc kubenswrapper[4788]: I0219 09:49:35.259035 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-52qjq" podStartSLOduration=2.716575353 podStartE2EDuration="8.259015097s" podCreationTimestamp="2026-02-19 09:49:27 +0000 UTC" firstStartedPulling="2026-02-19 09:49:29.172680553 +0000 UTC m=+3871.160692015" lastFinishedPulling="2026-02-19 09:49:34.715120277 +0000 UTC m=+3876.703131759" observedRunningTime="2026-02-19 09:49:35.251506252 +0000 UTC m=+3877.239517724" watchObservedRunningTime="2026-02-19 09:49:35.259015097 +0000 UTC m=+3877.247026569" Feb 19 09:49:37 crc kubenswrapper[4788]: I0219 09:49:37.874987 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:37 crc kubenswrapper[4788]: I0219 09:49:37.875234 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:38 crc kubenswrapper[4788]: I0219 09:49:38.940851 4788 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-52qjq" podUID="70b146bb-8c02-4c3e-af4e-192e857a3e58" containerName="registry-server" probeResult="failure" output=< Feb 19 09:49:38 crc kubenswrapper[4788]: timeout: failed to connect service ":50051" within 1s Feb 19 09:49:38 crc kubenswrapper[4788]: > Feb 19 09:49:42 crc kubenswrapper[4788]: I0219 09:49:42.715051 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8" Feb 19 09:49:42 crc kubenswrapper[4788]: E0219 09:49:42.715918 4788 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-tftzx_openshift-machine-config-operator(2c07881f-4511-4cd1-9283-6891826b57a1)\"" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" podUID="2c07881f-4511-4cd1-9283-6891826b57a1" Feb 19 09:49:45 crc kubenswrapper[4788]: I0219 09:49:45.925057 4788 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vf6h6"] Feb 19 09:49:45 crc kubenswrapper[4788]: I0219 09:49:45.935481 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:45 crc kubenswrapper[4788]: I0219 09:49:45.944009 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vf6h6"] Feb 19 09:49:45 crc kubenswrapper[4788]: I0219 09:49:45.957655 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-utilities\") pod \"certified-operators-vf6h6\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") " pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:45 crc kubenswrapper[4788]: I0219 09:49:45.957776 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcbj7\" (UniqueName: \"kubernetes.io/projected/865d0ad4-6c0f-407c-af48-c28149f71fc3-kube-api-access-mcbj7\") pod \"certified-operators-vf6h6\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") " pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:45 crc kubenswrapper[4788]: I0219 09:49:45.957996 4788 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-catalog-content\") pod \"certified-operators-vf6h6\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") " pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:46 crc kubenswrapper[4788]: I0219 09:49:46.059307 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-catalog-content\") pod \"certified-operators-vf6h6\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") " pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:46 crc kubenswrapper[4788]: I0219 09:49:46.059393 4788 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-utilities\") pod \"certified-operators-vf6h6\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") " pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:46 crc kubenswrapper[4788]: I0219 09:49:46.059465 4788 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcbj7\" (UniqueName: \"kubernetes.io/projected/865d0ad4-6c0f-407c-af48-c28149f71fc3-kube-api-access-mcbj7\") pod \"certified-operators-vf6h6\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") " pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:46 crc kubenswrapper[4788]: I0219 09:49:46.060538 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-catalog-content\") pod \"certified-operators-vf6h6\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") " pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:46 crc kubenswrapper[4788]: I0219 09:49:46.060752 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-utilities\") pod \"certified-operators-vf6h6\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") " pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:46 crc kubenswrapper[4788]: I0219 09:49:46.087669 4788 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcbj7\" (UniqueName: \"kubernetes.io/projected/865d0ad4-6c0f-407c-af48-c28149f71fc3-kube-api-access-mcbj7\") pod \"certified-operators-vf6h6\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") " pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:46 crc kubenswrapper[4788]: I0219 09:49:46.274577 4788 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:46 crc kubenswrapper[4788]: I0219 09:49:46.759538 4788 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vf6h6"] Feb 19 09:49:47 crc kubenswrapper[4788]: I0219 09:49:47.363160 4788 generic.go:334] "Generic (PLEG): container finished" podID="865d0ad4-6c0f-407c-af48-c28149f71fc3" containerID="b22f5278051cd44257603d2a68327bb9ce19057304954bdb7931406eb57c8c16" exitCode=0 Feb 19 09:49:47 crc kubenswrapper[4788]: I0219 09:49:47.363215 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf6h6" event={"ID":"865d0ad4-6c0f-407c-af48-c28149f71fc3","Type":"ContainerDied","Data":"b22f5278051cd44257603d2a68327bb9ce19057304954bdb7931406eb57c8c16"} Feb 19 09:49:47 crc kubenswrapper[4788]: I0219 09:49:47.363502 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf6h6" event={"ID":"865d0ad4-6c0f-407c-af48-c28149f71fc3","Type":"ContainerStarted","Data":"ee58a0bb7e9bd41bab47b150a779950971fb76633ff70ee62edbc623bf954470"} Feb 19 09:49:47 crc kubenswrapper[4788]: I0219 09:49:47.924170 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:47 crc kubenswrapper[4788]: I0219 09:49:47.979192 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:49 crc kubenswrapper[4788]: I0219 09:49:49.389629 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf6h6" event={"ID":"865d0ad4-6c0f-407c-af48-c28149f71fc3","Type":"ContainerStarted","Data":"e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc"} Feb 19 09:49:50 crc kubenswrapper[4788]: I0219 09:49:50.297971 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-52qjq"] Feb 19 09:49:50 crc kubenswrapper[4788]: I0219 09:49:50.298506 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-52qjq" podUID="70b146bb-8c02-4c3e-af4e-192e857a3e58" containerName="registry-server" containerID="cri-o://22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c" gracePeriod=2 Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.287605 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.380994 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wt2p\" (UniqueName: \"kubernetes.io/projected/70b146bb-8c02-4c3e-af4e-192e857a3e58-kube-api-access-2wt2p\") pod \"70b146bb-8c02-4c3e-af4e-192e857a3e58\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.381094 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-catalog-content\") pod \"70b146bb-8c02-4c3e-af4e-192e857a3e58\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.381158 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-utilities\") pod \"70b146bb-8c02-4c3e-af4e-192e857a3e58\" (UID: \"70b146bb-8c02-4c3e-af4e-192e857a3e58\") " Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.386790 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-utilities" (OuterVolumeSpecName: "utilities") pod "70b146bb-8c02-4c3e-af4e-192e857a3e58" (UID: 
"70b146bb-8c02-4c3e-af4e-192e857a3e58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.389853 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70b146bb-8c02-4c3e-af4e-192e857a3e58-kube-api-access-2wt2p" (OuterVolumeSpecName: "kube-api-access-2wt2p") pod "70b146bb-8c02-4c3e-af4e-192e857a3e58" (UID: "70b146bb-8c02-4c3e-af4e-192e857a3e58"). InnerVolumeSpecName "kube-api-access-2wt2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.410739 4788 generic.go:334] "Generic (PLEG): container finished" podID="70b146bb-8c02-4c3e-af4e-192e857a3e58" containerID="22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c" exitCode=0 Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.410804 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qjq" event={"ID":"70b146bb-8c02-4c3e-af4e-192e857a3e58","Type":"ContainerDied","Data":"22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c"} Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.410828 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52qjq" event={"ID":"70b146bb-8c02-4c3e-af4e-192e857a3e58","Type":"ContainerDied","Data":"8f461fda9421610391049d066a09f9fb421525b5f3bad782ae340bb1a157fb72"} Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.410846 4788 scope.go:117] "RemoveContainer" containerID="22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.410959 4788 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-52qjq" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.415815 4788 generic.go:334] "Generic (PLEG): container finished" podID="865d0ad4-6c0f-407c-af48-c28149f71fc3" containerID="e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc" exitCode=0 Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.415848 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf6h6" event={"ID":"865d0ad4-6c0f-407c-af48-c28149f71fc3","Type":"ContainerDied","Data":"e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc"} Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.468174 4788 scope.go:117] "RemoveContainer" containerID="57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.484059 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wt2p\" (UniqueName: \"kubernetes.io/projected/70b146bb-8c02-4c3e-af4e-192e857a3e58-kube-api-access-2wt2p\") on node \"crc\" DevicePath \"\"" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.484102 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-utilities\") on node \"crc\" DevicePath \"\"" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.494523 4788 scope.go:117] "RemoveContainer" containerID="934bc09a4d26dbc27b64ef7bf50f1dbd9b41596010ea89b03b717b1ac8999de9" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.500669 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70b146bb-8c02-4c3e-af4e-192e857a3e58" (UID: "70b146bb-8c02-4c3e-af4e-192e857a3e58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.541948 4788 scope.go:117] "RemoveContainer" containerID="22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c" Feb 19 09:49:51 crc kubenswrapper[4788]: E0219 09:49:51.542389 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c\": container with ID starting with 22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c not found: ID does not exist" containerID="22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.542426 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c"} err="failed to get container status \"22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c\": rpc error: code = NotFound desc = could not find container \"22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c\": container with ID starting with 22baa1f61535f0b4102ebde2a277acd7da300165a88eaf327d31070193a0fa9c not found: ID does not exist" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.542448 4788 scope.go:117] "RemoveContainer" containerID="57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6" Feb 19 09:49:51 crc kubenswrapper[4788]: E0219 09:49:51.542756 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6\": container with ID starting with 57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6 not found: ID does not exist" containerID="57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.542774 
4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6"} err="failed to get container status \"57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6\": rpc error: code = NotFound desc = could not find container \"57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6\": container with ID starting with 57ef33db2be9edc5cd46634ef2dbeda152c7dfeb61a28013a185345bf01a65d6 not found: ID does not exist" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.542787 4788 scope.go:117] "RemoveContainer" containerID="934bc09a4d26dbc27b64ef7bf50f1dbd9b41596010ea89b03b717b1ac8999de9" Feb 19 09:49:51 crc kubenswrapper[4788]: E0219 09:49:51.543063 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"934bc09a4d26dbc27b64ef7bf50f1dbd9b41596010ea89b03b717b1ac8999de9\": container with ID starting with 934bc09a4d26dbc27b64ef7bf50f1dbd9b41596010ea89b03b717b1ac8999de9 not found: ID does not exist" containerID="934bc09a4d26dbc27b64ef7bf50f1dbd9b41596010ea89b03b717b1ac8999de9" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.543081 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"934bc09a4d26dbc27b64ef7bf50f1dbd9b41596010ea89b03b717b1ac8999de9"} err="failed to get container status \"934bc09a4d26dbc27b64ef7bf50f1dbd9b41596010ea89b03b717b1ac8999de9\": rpc error: code = NotFound desc = could not find container \"934bc09a4d26dbc27b64ef7bf50f1dbd9b41596010ea89b03b717b1ac8999de9\": container with ID starting with 934bc09a4d26dbc27b64ef7bf50f1dbd9b41596010ea89b03b717b1ac8999de9 not found: ID does not exist" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.585883 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/70b146bb-8c02-4c3e-af4e-192e857a3e58-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.777111 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-52qjq"] Feb 19 09:49:51 crc kubenswrapper[4788]: I0219 09:49:51.790277 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-52qjq"] Feb 19 09:49:52 crc kubenswrapper[4788]: I0219 09:49:52.430205 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf6h6" event={"ID":"865d0ad4-6c0f-407c-af48-c28149f71fc3","Type":"ContainerStarted","Data":"9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60"} Feb 19 09:49:52 crc kubenswrapper[4788]: I0219 09:49:52.460789 4788 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vf6h6" podStartSLOduration=2.970458258 podStartE2EDuration="7.460737958s" podCreationTimestamp="2026-02-19 09:49:45 +0000 UTC" firstStartedPulling="2026-02-19 09:49:47.366010172 +0000 UTC m=+3889.354021674" lastFinishedPulling="2026-02-19 09:49:51.856289872 +0000 UTC m=+3893.844301374" observedRunningTime="2026-02-19 09:49:52.459771724 +0000 UTC m=+3894.447783196" watchObservedRunningTime="2026-02-19 09:49:52.460737958 +0000 UTC m=+3894.448749470" Feb 19 09:49:52 crc kubenswrapper[4788]: I0219 09:49:52.732662 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70b146bb-8c02-4c3e-af4e-192e857a3e58" path="/var/lib/kubelet/pods/70b146bb-8c02-4c3e-af4e-192e857a3e58/volumes" Feb 19 09:49:56 crc kubenswrapper[4788]: I0219 09:49:56.275058 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vf6h6" Feb 19 09:49:56 crc kubenswrapper[4788]: I0219 09:49:56.275972 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-vf6h6"
Feb 19 09:49:56 crc kubenswrapper[4788]: I0219 09:49:56.355012 4788 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vf6h6"
Feb 19 09:49:57 crc kubenswrapper[4788]: I0219 09:49:57.715029 4788 scope.go:117] "RemoveContainer" containerID="7fd63491cd7f6cea4d076bcc429df6bbece82ce2c0957e4486a45f01d860beb8"
Feb 19 09:49:58 crc kubenswrapper[4788]: I0219 09:49:58.502792 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-tftzx" event={"ID":"2c07881f-4511-4cd1-9283-6891826b57a1","Type":"ContainerStarted","Data":"7b6c03f7ae617055a8660c1c7e855ecc39a8ea90ffa44e66b9773a336c53bcfd"}
Feb 19 09:50:06 crc kubenswrapper[4788]: I0219 09:50:06.358460 4788 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vf6h6"
Feb 19 09:50:06 crc kubenswrapper[4788]: I0219 09:50:06.457538 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vf6h6"]
Feb 19 09:50:06 crc kubenswrapper[4788]: I0219 09:50:06.598258 4788 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vf6h6" podUID="865d0ad4-6c0f-407c-af48-c28149f71fc3" containerName="registry-server" containerID="cri-o://9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60" gracePeriod=2
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.046968 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vf6h6"
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.222824 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-catalog-content\") pod \"865d0ad4-6c0f-407c-af48-c28149f71fc3\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") "
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.223081 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcbj7\" (UniqueName: \"kubernetes.io/projected/865d0ad4-6c0f-407c-af48-c28149f71fc3-kube-api-access-mcbj7\") pod \"865d0ad4-6c0f-407c-af48-c28149f71fc3\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") "
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.223393 4788 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-utilities\") pod \"865d0ad4-6c0f-407c-af48-c28149f71fc3\" (UID: \"865d0ad4-6c0f-407c-af48-c28149f71fc3\") "
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.224547 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-utilities" (OuterVolumeSpecName: "utilities") pod "865d0ad4-6c0f-407c-af48-c28149f71fc3" (UID: "865d0ad4-6c0f-407c-af48-c28149f71fc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.240444 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865d0ad4-6c0f-407c-af48-c28149f71fc3-kube-api-access-mcbj7" (OuterVolumeSpecName: "kube-api-access-mcbj7") pod "865d0ad4-6c0f-407c-af48-c28149f71fc3" (UID: "865d0ad4-6c0f-407c-af48-c28149f71fc3"). InnerVolumeSpecName "kube-api-access-mcbj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.285362 4788 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "865d0ad4-6c0f-407c-af48-c28149f71fc3" (UID: "865d0ad4-6c0f-407c-af48-c28149f71fc3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.325110 4788 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.325138 4788 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcbj7\" (UniqueName: \"kubernetes.io/projected/865d0ad4-6c0f-407c-af48-c28149f71fc3-kube-api-access-mcbj7\") on node \"crc\" DevicePath \"\""
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.325147 4788 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/865d0ad4-6c0f-407c-af48-c28149f71fc3-utilities\") on node \"crc\" DevicePath \"\""
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.619923 4788 generic.go:334] "Generic (PLEG): container finished" podID="865d0ad4-6c0f-407c-af48-c28149f71fc3" containerID="9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60" exitCode=0
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.619987 4788 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vf6h6"
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.620008 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf6h6" event={"ID":"865d0ad4-6c0f-407c-af48-c28149f71fc3","Type":"ContainerDied","Data":"9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60"}
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.620436 4788 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vf6h6" event={"ID":"865d0ad4-6c0f-407c-af48-c28149f71fc3","Type":"ContainerDied","Data":"ee58a0bb7e9bd41bab47b150a779950971fb76633ff70ee62edbc623bf954470"}
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.620457 4788 scope.go:117] "RemoveContainer" containerID="9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60"
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.676540 4788 scope.go:117] "RemoveContainer" containerID="e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc"
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.708094 4788 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vf6h6"]
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.723796 4788 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vf6h6"]
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.744362 4788 scope.go:117] "RemoveContainer" containerID="b22f5278051cd44257603d2a68327bb9ce19057304954bdb7931406eb57c8c16"
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.812722 4788 scope.go:117] "RemoveContainer" containerID="9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60"
Feb 19 09:50:07 crc kubenswrapper[4788]: E0219 09:50:07.818735 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60\": container with ID starting with 9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60 not found: ID does not exist" containerID="9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60"
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.818789 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60"} err="failed to get container status \"9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60\": rpc error: code = NotFound desc = could not find container \"9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60\": container with ID starting with 9a97f0a6be345028ce40a80e2dcb9027348bdde2af05693c5d3ecc21bcd65d60 not found: ID does not exist"
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.818821 4788 scope.go:117] "RemoveContainer" containerID="e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc"
Feb 19 09:50:07 crc kubenswrapper[4788]: E0219 09:50:07.826995 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc\": container with ID starting with e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc not found: ID does not exist" containerID="e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc"
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.827045 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc"} err="failed to get container status \"e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc\": rpc error: code = NotFound desc = could not find container \"e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc\": container with ID starting with e3d998f72fa8716bdec6ca1117337187bd921ba8bf965b5437c0c9af872bb7fc not found: ID does not exist"
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.827075 4788 scope.go:117] "RemoveContainer" containerID="b22f5278051cd44257603d2a68327bb9ce19057304954bdb7931406eb57c8c16"
Feb 19 09:50:07 crc kubenswrapper[4788]: E0219 09:50:07.832187 4788 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b22f5278051cd44257603d2a68327bb9ce19057304954bdb7931406eb57c8c16\": container with ID starting with b22f5278051cd44257603d2a68327bb9ce19057304954bdb7931406eb57c8c16 not found: ID does not exist" containerID="b22f5278051cd44257603d2a68327bb9ce19057304954bdb7931406eb57c8c16"
Feb 19 09:50:07 crc kubenswrapper[4788]: I0219 09:50:07.832255 4788 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b22f5278051cd44257603d2a68327bb9ce19057304954bdb7931406eb57c8c16"} err="failed to get container status \"b22f5278051cd44257603d2a68327bb9ce19057304954bdb7931406eb57c8c16\": rpc error: code = NotFound desc = could not find container \"b22f5278051cd44257603d2a68327bb9ce19057304954bdb7931406eb57c8c16\": container with ID starting with b22f5278051cd44257603d2a68327bb9ce19057304954bdb7931406eb57c8c16 not found: ID does not exist"
Feb 19 09:50:08 crc kubenswrapper[4788]: I0219 09:50:08.769686 4788 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865d0ad4-6c0f-407c-af48-c28149f71fc3" path="/var/lib/kubelet/pods/865d0ad4-6c0f-407c-af48-c28149f71fc3/volumes"